Mar 16 00:06:30 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 16 00:06:30 crc restorecon[4697]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
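
Each entry above records restorecon declining to relabel a file: the file's current type, container_file_t carrying a per-pod MCS category pair such as c764,c897, is listed among the policy's customizable types, which restorecon leaves in place unless forced. For bulk triage of a dump like this, a short script can tally the skipped paths per pod directory and per target context. This is a minimal sketch under that reading of the log; the regexes and the tally_skips helper are illustrative, not part of any OpenShift or SELinux tooling:

```python
import re
from collections import Counter

# Shape of the entries in this journal dump (one per skipped file):
#   ... restorecon[PID]: PATH not reset as customized by admin to CONTEXT
ENTRY = re.compile(
    r"restorecon\[\d+\]: (?P<path>\S+) "
    r"not reset as customized by admin to (?P<context>\S+)"
)
POD_UID = re.compile(r"/var/lib/kubelet/pods/(?P<uid>[0-9a-f-]+)/")

def tally_skips(journal_text: str) -> tuple[Counter, Counter]:
    """Count skipped paths per pod directory and per target SELinux context."""
    by_pod: Counter = Counter()
    by_context: Counter = Counter()
    for m in ENTRY.finditer(journal_text):
        by_context[m.group("context")] += 1
        pod = POD_UID.search(m.group("path"))
        if pod:
            by_pod[pod.group("uid")] += 1
    return by_pod, by_context

if __name__ == "__main__":
    sample = (
        "Mar 16 00:06:30 crc restorecon[4697]: "
        "/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts "
        "not reset as customized by admin to "
        "system_u:object_r:container_file_t:s0:c7,c13"
    )
    by_pod, by_context = tally_skips(sample)
    print(by_pod.most_common(3), by_context.most_common(3))
```

Run over the full dump, the tallies would mostly point at the single catalog pod (57a731c4-ef35-47a8-b875-bfb08a7f8011) whose extracted catalog-content dominates the listing, rather than at anything system-wide.
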
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]:
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
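
The skips are not errors: restorecon resets customizable types only when invoked with -F, and with the targeted policy the list of such types normally lives in /etc/selinux/targeted/contexts/customizable_types (container_file_t appears in it, which is why every file above keeps its admin-set label). The sketch below, with a hypothetical helper name, previews what a forced relabel would change without touching kubelet state; the restorecon flags are standard, everything else is illustrative:

```python
import subprocess

def preview_forced_relabel(path: str) -> str:
    """Dry-run restorecon: -n makes no changes, -v reports what would
    change, -F also covers customizable types such as container_file_t,
    -R recurses. Needs the policycoreutils restorecon binary on PATH."""
    result = subprocess.run(
        ["restorecon", "-n", "-v", "-F", "-R", path],
        capture_output=True, text=True, check=False,
    )
    return result.stdout

if __name__ == "__main__":
    # Per-pod MCS pairs (c14,c22 and the like above) are assigned at
    # runtime, so a forced relabel would strip them; preview only.
    print(preview_forced_relabel("/var/lib/kubelet/pods"))
```

Keeping the run read-only matters here: actually stripping the category pairs would loosen the per-pod separation those labels provide until the pods are recreated.
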
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]:
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:30 crc restorecon[4697]: 
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 
00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.825207 4983 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837232 4983 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837275 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837284 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837294 4983 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837303 4983 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837312 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837321 4983 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837329 4983 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837336 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837344 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837352 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837359 4983 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837367 4983 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837375 4983 feature_gate.go:330] unrecognized feature gate: Example Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837385 4983 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837397 4983 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837405 4983 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837413 4983 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837421 4983 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837428 4983 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837438 4983 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837445 4983 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837453 4983 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837460 4983 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837468 4983 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837476 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837483 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837491 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837498 4983 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837509 4983 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837518 4983 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837527 4983 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837535 4983 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837554 4983 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837562 4983 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837571 4983 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837579 4983 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837587 4983 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837596 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837604 4983 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837611 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837619 4983 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837627 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837634 4983 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837642 4983 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837649 4983 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837657 4983 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837667 4983 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837676 4983 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837683 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837691 4983 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837699 4983 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837709 4983 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837718 4983 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837727 4983 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837735 4983 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837742 4983 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837750 4983 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837792 4983 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837802 4983 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837812 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837822 4983 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837832 4983 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837841 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837849 4983 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837857 4983 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837864 4983 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837872 4983 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837880 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837890 4983 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837903 4983 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
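Annotation: every `feature_gate.go:330] unrecognized feature gate: ...` warning above names an OpenShift cluster-level gate (NewOLM, GatewayAPI, InsightsConfig, and so on) that the embedded upstream kubelet does not define, so it warns and ignores the gate rather than failing startup. The interleaved `feature_gate.go:353`/`:351` lines are the opposite case: gates the kubelet does know (ValidatingAdmissionPolicy, DisableKubeletCloudCredentialProviders, CloudDualStackNodeIPs, KMSv1) that are already GA or deprecated. The same block recurs several times below because the gate set is parsed more than once during startup. A sketch that reduces the noise to the distinct gate names and shows how often each repeats, under the same assumed kubelet-journal.log file:

```python
import re
from collections import Counter

GATE = re.compile(r"unrecognized feature gate: ([A-Za-z0-9]+)")

def unrecognized_gates(path="kubelet-journal.log"):
    """Count how often each unrecognized feature gate is warned about."""
    counts = Counter()
    with open(path) as fh:
        for line in fh:
            counts.update(GATE.findall(line))
    return counts

if __name__ == "__main__":
    counts = unrecognized_gates()
    print(f"{len(counts)} distinct unknown gates; counts > 1 are repeated passes")
    for gate, n in counts.most_common():
        print(f"{n:3d}x {gate}")
```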
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838118 4983 flags.go:64] FLAG: --address="0.0.0.0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838143 4983 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838169 4983 flags.go:64] FLAG: --anonymous-auth="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838181 4983 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838192 4983 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838201 4983 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838212 4983 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838223 4983 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838232 4983 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838241 4983 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838251 4983 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838282 4983 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838293 4983 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838302 4983 flags.go:64] FLAG: --cgroup-root=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838311 4983 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838321 4983 flags.go:64] FLAG: --client-ca-file=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838330 4983 flags.go:64] FLAG: --cloud-config=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838340 4983 flags.go:64] FLAG: --cloud-provider=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838349 4983 flags.go:64] FLAG: --cluster-dns="[]"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838374 4983 flags.go:64] FLAG: --cluster-domain=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838383 4983 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838392 4983 flags.go:64] FLAG: --config-dir=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838401 4983 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838411 4983 flags.go:64] FLAG: --container-log-max-files="5"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838430 4983 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838439 4983 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838448 4983 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838459 4983 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838468 4983 flags.go:64] FLAG: --contention-profiling="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838477 4983 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838486 4983 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838497 4983 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838520 4983 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838531 4983 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838541 4983 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838550 4983 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838559 4983 flags.go:64] FLAG: --enable-load-reader="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838568 4983 flags.go:64] FLAG: --enable-server="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838578 4983 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838596 4983 flags.go:64] FLAG: --event-burst="100"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838606 4983 flags.go:64] FLAG: --event-qps="50"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838615 4983 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838624 4983 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838634 4983 flags.go:64] FLAG: --eviction-hard=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838645 4983 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838654 4983 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838663 4983 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838683 4983 flags.go:64] FLAG: --eviction-soft=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838693 4983 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838702 4983 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838711 4983 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838719 4983 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838730 4983 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838742 4983 flags.go:64] FLAG: --fail-swap-on="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838791 4983 flags.go:64] FLAG: --feature-gates=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838808 4983 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838820 4983 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838829 4983 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838838 4983 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838848 4983 flags.go:64] FLAG: --healthz-port="10248"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838857 4983 flags.go:64] FLAG: --help="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838866 4983 flags.go:64] FLAG: --hostname-override=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838875 4983 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838884 4983 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838898 4983 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838906 4983 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838915 4983 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838924 4983 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838932 4983 flags.go:64] FLAG: --image-service-endpoint=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838941 4983 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838949 4983 flags.go:64] FLAG: --kube-api-burst="100"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838960 4983 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838971 4983 flags.go:64] FLAG: --kube-api-qps="50"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838982 4983 flags.go:64] FLAG: --kube-reserved=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838994 4983 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839004 4983 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839015 4983 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839024 4983 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839032 4983 flags.go:64] FLAG: --lock-file=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839041 4983 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839050 4983 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839059 4983 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839073 4983 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839097 4983 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839106 4983 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839116 4983 flags.go:64] FLAG: --logging-format="text"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839124 4983 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839134 4983 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839142 4983 flags.go:64] FLAG: --manifest-url=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839151 4983 flags.go:64] FLAG: --manifest-url-header=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839163 4983 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839172 4983 flags.go:64] FLAG: --max-open-files="1000000"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839182 4983 flags.go:64] FLAG: --max-pods="110"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839191 4983 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839201 4983 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839210 4983 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839222 4983 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839231 4983 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839241 4983 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839249 4983 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839269 4983 flags.go:64] FLAG: --node-status-max-images="50"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839278 4983 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839286 4983 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839296 4983 flags.go:64] FLAG: --pod-cidr=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839305 4983 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839317 4983 flags.go:64] FLAG: --pod-manifest-path=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839326 4983 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839335 4983 flags.go:64] FLAG: --pods-per-core="0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839344 4983 flags.go:64] FLAG: --port="10250"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839353 4983 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839362 4983 flags.go:64] FLAG: --provider-id=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839371 4983 flags.go:64] FLAG: --qos-reserved=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839380 4983 flags.go:64] FLAG: --read-only-port="10255"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839389 4983 flags.go:64] FLAG: --register-node="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839398 4983 flags.go:64] FLAG: --register-schedulable="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839407 4983 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839421 4983 flags.go:64] FLAG: --registry-burst="10"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839430 4983 flags.go:64] FLAG: --registry-qps="5"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839439 4983 flags.go:64] FLAG: --reserved-cpus=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839460 4983 flags.go:64] FLAG: --reserved-memory=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839471 4983 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839482 4983 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839491 4983 flags.go:64] FLAG: --rotate-certificates="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839500 4983 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839509 4983 flags.go:64] FLAG: --runonce="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839518 4983 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839528 4983 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839537 4983 flags.go:64] FLAG: --seccomp-default="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839550 4983 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839559 4983 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839569 4983 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839578 4983 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839588 4983 flags.go:64] FLAG: --storage-driver-password="root"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839596 4983 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839605 4983 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839613 4983 flags.go:64] FLAG: --storage-driver-user="root"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839623 4983 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839631 4983 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839640 4983 flags.go:64] FLAG: --system-cgroups=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839649 4983 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839662 4983 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839672 4983 flags.go:64] FLAG: --tls-cert-file=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839681 4983 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839697 4983 flags.go:64] FLAG: --tls-min-version=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839706 4983 flags.go:64] FLAG: --tls-private-key-file=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839714 4983 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839723 4983 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839733 4983 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839742 4983 flags.go:64] FLAG: --v="2"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839780 4983 flags.go:64] FLAG: --version="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839792 4983 flags.go:64] FLAG: --vmodule=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839803 4983 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839813 4983 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840051 4983 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840064 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840084 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840092 4983 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840101 4983 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840115 4983 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840123 4983 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840133 4983 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840141 4983 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840149 4983 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840157 4983 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840165 4983 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840172 4983 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840180 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840188 4983 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840196 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840204 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840212 4983 feature_gate.go:330] unrecognized feature gate: Example
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840220 4983 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840227 4983 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840238 4983 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
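Annotation: the `flags.go:64] FLAG: --name="value"` block above is the kubelet echoing the final value of every flag (logged because of --v="2"), which makes it a convenient audit surface: it confirms --config="/etc/kubernetes/kubelet.conf" as well as the deprecated flags warned about earlier. A sketch that folds those entries into a dict, under the same assumed kubelet-journal.log file:

```python
import re

FLAG_LINE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w-]+)="(.*?)"')

def kubelet_flags(path="kubelet-journal.log"):
    """Map each echoed kubelet flag to its effective value."""
    flags = {}
    with open(path) as fh:
        for line in fh:
            for name, value in FLAG_LINE.findall(line):
                flags[name] = value
    return flags

if __name__ == "__main__":
    flags = kubelet_flags()
    # Cross-check the deprecation warnings earlier in the journal:
    for name in ("--config", "--container-runtime-endpoint",
                 "--system-reserved", "--register-with-taints"):
        print(f"{name} = {flags.get(name)!r}")
```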
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840248 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840257 4983 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840266 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840274 4983 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840282 4983 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840290 4983 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840298 4983 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840305 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840313 4983 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840321 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840328 4983 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840336 4983 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840343 4983 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840352 4983 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840360 4983 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840367 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840377 4983 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840396 4983 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840407 4983 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840415 4983 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840423 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840431 4983 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840439 4983 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840446 4983 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840456 4983 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840465 4983 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840473 4983 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840481 4983 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840490 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840497 4983 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840505 4983 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840513 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840523 4983 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840533 4983 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840542 4983 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840552 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840560 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840569 4983 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840577 4983 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840585 4983 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840593 4983 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840602 4983 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840610 4983 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840617 4983 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840625 4983 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840633 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840640 4983 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840648 4983 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840661 4983 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840670 4983 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.840697 4983 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.854562 4983 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.854620 4983 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854747 4983 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854806 4983 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854821 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854838 4983 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854852 4983 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854863 4983 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854874 4983 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854884 4983 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854893 4983 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854903 4983 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854915 4983 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
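Annotation: the `feature_gate.go:386] feature gates: {map[...]}` entry above is the outcome of all the preceding warnings: the gate map the kubelet actually applied, printed in Go's map syntax. The same map is printed again after each later parsing pass, and all passes agree. Parsing it back into a dict makes the result easy to inspect or diff; a sketch under the same journal-file assumption:

```python
import re

MAP_LINE = re.compile(r"feature gates: \{map\[(.*?)\]\}")

def effective_gates(path="kubelet-journal.log"):
    """Parse Go-formatted 'feature gates: {map[Gate:bool ...]}' entries."""
    gates = {}
    with open(path) as fh:
        for line in fh:
            for body in MAP_LINE.findall(line):
                for pair in body.split():
                    name, _, val = pair.partition(":")
                    gates[name] = (val == "true")
    return gates

if __name__ == "__main__":
    gates = effective_gates()
    print("enabled: ", sorted(k for k, v in gates.items() if v))
    print("disabled:", sorted(k for k, v in gates.items() if not v))
    # Expect CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders,
    # KMSv1 and ValidatingAdmissionPolicy among the enabled gates.
```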
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854926 4983 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854938 4983 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854947 4983 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854959 4983 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854969 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854979 4983 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854988 4983 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854996 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855006 4983 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855015 4983 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855023 4983 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855032 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855042 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855051 4983 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855059 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855068 4983 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855078 4983 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855087 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855099 4983 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855108 4983 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855117 4983 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855127 4983 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855136 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855145 4983 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855154 4983 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855162 4983 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855171 4983 feature_gate.go:330] unrecognized feature gate: Example Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855179 4983 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855187 4983 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855196 4983 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855204 4983 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855212 4983 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855220 4983 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855228 4983 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855237 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855246 4983 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855254 4983 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855263 4983 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855271 4983 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855280 4983 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855288 4983 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855297 4983 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855305 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855314 4983 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855322 4983 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855330 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855342 4983 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855352 4983 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855365 4983 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855376 4983 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855387 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855397 4983 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855406 4983 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855415 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855424 4983 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855433 4983 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855442 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855452 4983 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855464 4983 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855473 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.855488 4983 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855746 4983 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855802 4983 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855814 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855824 4983 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855833 4983 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855841 4983 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855852 4983 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855861 4983 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855870 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855879 4983 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 
00:06:31.855888 4983 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855896 4983 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855905 4983 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855913 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855922 4983 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855930 4983 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855938 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855946 4983 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855955 4983 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855967 4983 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855978 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855988 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855998 4983 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856008 4983 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856018 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856027 4983 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856036 4983 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856044 4983 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856052 4983 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856062 4983 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856072 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856080 4983 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856088 4983 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856096 4983 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856104 4983 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856113 4983 
feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856121 4983 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856129 4983 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856140 4983 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856151 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856160 4983 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856170 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856179 4983 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856187 4983 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856195 4983 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856203 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856212 4983 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856220 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856228 4983 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856237 4983 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856246 4983 feature_gate.go:330] unrecognized feature gate: Example Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856254 4983 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856264 4983 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856272 4983 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856280 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856288 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856298 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856310 4983 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856324 4983 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856334 4983 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856344 4983 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856355 4983 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856363 4983 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856372 4983 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856380 4983 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856390 4983 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856399 4983 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856408 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856416 4983 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856424 4983 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856432 4983 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.856447 4983 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.856731 4983 server.go:940] "Client rotation is on, will bootstrap in background" Mar 16 00:06:31 crc kubenswrapper[4983]: E0316 00:06:31.862527 4983 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.868372 4983 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.868538 4983 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
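Annotation: the entries above show the client-certificate bootstrap path. `bootstrap.go:266` reports that the client certificate embedded in /var/lib/kubelet/kubeconfig is expired (not valid past 2026-02-24 05:52:08 UTC), so the kubelet falls back to the bootstrap kubeconfig (/etc/kubernetes/kubeconfig per the FLAG dump) to request a fresh certificate, and loads the rotating pair from /var/lib/kubelet/pki/kubelet-client-current.pem. The first signing request just below fails with "connection refused" because the API server at api-int.crc.testing:6443 is not accepting connections yet; the certificate manager keeps retrying. To check such a PEM's validity window by hand, a sketch assuming the third-party cryptography package is installed (pip install cryptography; the *_utc accessors need version 42+):

```python
from datetime import datetime, timezone

from cryptography import x509  # assumed dependency, not shown in this log

def cert_window(pem_path):
    """Return the (not_before, not_after) validity window of a PEM cert."""
    with open(pem_path, "rb") as fh:
        cert = x509.load_pem_x509_certificate(fh.read())
    return cert.not_valid_before_utc, cert.not_valid_after_utc

if __name__ == "__main__":
    path = "/var/lib/kubelet/pki/kubelet-client-current.pem"
    not_before, not_after = cert_window(path)
    status = "EXPIRED" if datetime.now(timezone.utc) > not_after else "valid"
    print(f"{path}: {not_before} .. {not_after} ({status})")
```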
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.870616 4983 server.go:997] "Starting client certificate rotation" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.870681 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.870930 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.900460 4983 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 16 00:06:31 crc kubenswrapper[4983]: E0316 00:06:31.904501 4983 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.904560 4983 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.921628 4983 log.go:25] "Validated CRI v1 runtime API" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.955170 4983 log.go:25] "Validated CRI v1 image API" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.960084 4983 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.964823 4983 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-16-00-00-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.964870 4983 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.991220 4983 manager.go:217] Machine: {Timestamp:2026-03-16 00:06:31.987917689 +0000 UTC m=+0.588016189 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2ead470a-f520-44aa-9efc-f4170c7efbf2 BootID:07bf7a14-97e0-4c5e-b357-db0b2f7bca2e Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 
HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:eb:47:27 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:eb:47:27 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:76:fd:9d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:dd:62:62 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:54:44:df Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7a:50:f1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:9a:cb:14:3c:2c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:ce:d3:45:61:cc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data 
Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.991592 4983 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.991818 4983 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.994608 4983 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.994948 4983 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.995007 4983 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.995303 4983 topology_manager.go:138] "Creating topology manager with none policy" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.995320 4983 container_manager_linux.go:303] "Creating device plugin manager" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.996514 4983 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 16 00:06:31 crc 
kubenswrapper[4983]: I0316 00:06:31.996562 4983 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.996907 4983 state_mem.go:36] "Initialized new in-memory state store" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.997043 4983 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.001923 4983 kubelet.go:418] "Attempting to sync node with API server" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.001968 4983 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.002007 4983 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.002027 4983 kubelet.go:324] "Adding apiserver pod source" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.002045 4983 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.005809 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.005915 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.005846 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.006254 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.007002 4983 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.008130 4983 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
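Every reflector list above fails with "dial tcp 38.102.83.223:6443: connect: connection refused" because the kubelet comes up before the API server at api-int.crc.testing:6443 is listening; client-go logs the unhandled error and keeps retrying until the endpoint accepts connections. A minimal probe loop with capped exponential backoff shows the shape of that recovery; the 200ms starting interval mirrors the lease controller's retry interval seen later in the log, and the code is illustrative, not client-go's reflector.

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	const addr = "api-int.crc.testing:6443" // endpoint from the log
	delay := 200 * time.Millisecond
	for {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			fmt.Println("API server is accepting connections")
			return
		}
		fmt.Printf("still unreachable (%v), retrying in %s\n", err, delay)
		time.Sleep(delay)
		if delay < 5*time.Second { // exponential backoff with a cap
			delay *= 2
		}
	}
}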
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.012562 4983 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014076 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014157 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014211 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014274 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014328 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014385 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014433 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014484 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014533 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014581 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014636 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014694 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.016930 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.017996 4983 server.go:1280] "Started kubelet" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.022907 4983 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.022599 4983 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 16 00:06:32 crc systemd[1]: Started Kubernetes Kubelet. 
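With the kubelet started, the volume manager's desired- and actual-state populators spin up, and the long run of reconstruct.go:130 messages below is the kubelet rebuilding its actual state of the world from the on-disk pod tree before the API server is reachable; each volume is marked "uncertain" until it can be verified. A rough sketch of that directory walk follows; the /var/lib/kubelet/pods/<podUID>/volumes/<plugin>/<volume> layout is the standard kubelet tree (plugin directories use "~" on disk, e.g. kubernetes.io~configmap), and the code is not the actual reconstruct.go.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// After a restart the kubelet rescans this tree to repopulate its
	// actual state of the world without asking the API server.
	root := "/var/lib/kubelet/pods"
	pods, err := os.ReadDir(root)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	for _, pod := range pods {
		plugins, err := os.ReadDir(filepath.Join(root, pod.Name(), "volumes"))
		if err != nil {
			continue // pod dir without a volumes subdir
		}
		for _, plugin := range plugins {
			vols, err := os.ReadDir(filepath.Join(root, pod.Name(), "volumes", plugin.Name()))
			if err != nil {
				continue
			}
			for _, v := range vols {
				// Mirrors the reconstruct.go fields: podName + volumeName.
				fmt.Printf("podName=%s volumeName=%s/%s\n",
					pod.Name(), plugin.Name(), v.Name())
			}
		}
	}
}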
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.024406 4983 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.025521 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.028055 4983 server.go:460] "Adding debug handlers to kubelet server" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.033516 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.033592 4983 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.034066 4983 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.034089 4983 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.034142 4983 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.034553 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.034622 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.034931 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.034964 4983 factory.go:55] Registering systemd factory Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035286 4983 factory.go:221] Registration of the systemd container factory successfully Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035842 4983 factory.go:153] Registering CRI-O factory Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035880 4983 factory.go:221] Registration of the crio container factory successfully Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035960 4983 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035997 4983 factory.go:103] Registering Raw factory Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.036016 4983 manager.go:1196] Started watching for new ooms in manager Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.036828 4983 manager.go:319] Starting recovery of all containers Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.034044 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d299f34e6fcf6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,LastTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.036870 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041866 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041931 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041952 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041972 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041984 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042002 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042016 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042029 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042044 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042057 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042074 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042086 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042100 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042115 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042210 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042231 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042248 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042262 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042275 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042316 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.045708 4983 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.045907 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046043 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046127 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046211 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046294 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046385 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046514 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046640 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046749 4983 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046874 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046958 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047043 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047137 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047219 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047440 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047559 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047652 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047732 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047843 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047937 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048017 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048322 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048448 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048608 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048713 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048826 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048928 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049011 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049092 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049200 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049296 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049401 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049500 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049605 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049711 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049843 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049935 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050025 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050109 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050188 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050275 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050379 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050487 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050583 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050688 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050808 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050905 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051001 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051083 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051168 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051250 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051350 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051455 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051542 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051629 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051710 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051851 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051951 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052096 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052182 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052262 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052348 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052428 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052510 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052590 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052672 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052805 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052905 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053041 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053160 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053249 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053328 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053426 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053521 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053654 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053803 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053890 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053999 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.054288 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.054388 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.054478 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.054561 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.055286 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056456 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056578 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056605 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056639 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056669 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056688 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056718 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056745 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056788 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056806 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056825 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056844 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056860 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057362 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057388 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057416 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057435 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057462 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057487 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057512 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057542 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057564 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057587 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057614 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057636 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057668 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057692 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057711 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057732 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057769 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057793 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057813 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057832 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057861 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057887 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057914 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057940 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057966 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057993 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058013 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058038 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058056 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058082 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058117 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058136 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058156 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058176 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058193 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058214 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058230 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058250 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058274 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058290 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058308 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058325 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058338 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058356 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058368 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058390 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058406 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058426 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058489 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058506 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058529 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058547 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058561 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058588 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058601 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058615 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058632 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058646 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058667 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058684 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058698 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058714 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058729 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058765 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058784 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058798 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058814 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058826 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058843 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058856 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058868 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058885 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058897 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058919 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058937 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058953 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058970 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058982 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059006 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059018 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059033 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059050 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059062 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059078 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059092 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059105 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059121 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059139 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059150 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059166 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059180 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059195 4983 reconstruct.go:97] "Volume reconstruction finished" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059204 4983 reconciler.go:26] "Reconciler: start to sync state" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.064710 4983 manager.go:324] Recovery completed Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.079087 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.080835 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.080883 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.080895 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.082339 4983 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.082657 4983 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.082783 4983 state_mem.go:36] "Initialized new in-memory state store" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.089629 4983 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.091253 4983 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.091304 4983 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.091338 4983 kubelet.go:2335] "Starting kubelet main sync loop" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.091525 4983 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.093879 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.093998 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.101465 4983 policy_none.go:49] "None policy: Start" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.102645 4983 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.102701 4983 state_mem.go:35] "Initializing new in-memory state store" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.135534 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.156998 4983 manager.go:334] "Starting Device Plugin manager" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.157114 4983 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.157138 4983 server.go:79] "Starting device plugin registration server" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158007 4983 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158040 4983 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158328 4983 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158469 4983 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158492 4983 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.167128 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.192519 4983 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:06:32 crc kubenswrapper[4983]: 
I0316 00:06:32.192701 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.194695 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.194736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.194766 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.194962 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.195133 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.195186 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.195996 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196035 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196044 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196133 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196545 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196569 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197007 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197181 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197208 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197240 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197330 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197359 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197391 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197440 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197471 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197922 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198084 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198397 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198478 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198570 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198588 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198505 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198831 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc 
kubenswrapper[4983]: I0316 00:06:32.198943 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198971 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.200604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.200623 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.200632 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.200998 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.201051 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.201063 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.201343 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.201405 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.202447 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.202472 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.202481 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.239877 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.258816 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.260080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.260153 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.260179 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.260235 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.261086 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261216 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261316 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261374 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261413 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261453 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261487 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261586 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261682 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261729 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261791 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261834 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261863 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261882 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261926 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261954 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363414 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363478 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363520 4983 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363535 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363561 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363603 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363645 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363660 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363691 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363733 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.363550 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d299f34e6fcf6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,LastTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC 
m=+0.618034086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363737 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363867 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363881 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363896 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363956 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363978 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363936 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364053 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363941 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363932 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363896 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363864 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364271 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364310 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364551 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364618 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364729 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364830 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364910 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.461623 4983 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.464030 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.464081 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.464098 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.464139 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.464815 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.518302 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.525343 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.545185 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.569221 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-38b9c4b82dacfca28c66a23c0c39b80afdf00c14c9cdc7cae09e7662b0f01564 WatchSource:0}: Error finding container 38b9c4b82dacfca28c66a23c0c39b80afdf00c14c9cdc7cae09e7662b0f01564: Status 404 returned error can't find the container with id 38b9c4b82dacfca28c66a23c0c39b80afdf00c14c9cdc7cae09e7662b0f01564 Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.569516 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.572959 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-892b62111794c7ab545a80c0afb380ed7cc2f821a9903a1312a809a47a88e8d9 WatchSource:0}: Error finding container 892b62111794c7ab545a80c0afb380ed7cc2f821a9903a1312a809a47a88e8d9: Status 404 returned error can't find the container with id 892b62111794c7ab545a80c0afb380ed7cc2f821a9903a1312a809a47a88e8d9 Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.579923 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.585243 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8cad439d2f13373ddf0a9f9dafc0ab855b098fa917e6a8b1d9bd3fd177c03009 WatchSource:0}: Error finding container 8cad439d2f13373ddf0a9f9dafc0ab855b098fa917e6a8b1d9bd3fd177c03009: Status 404 returned error can't find the container with id 8cad439d2f13373ddf0a9f9dafc0ab855b098fa917e6a8b1d9bd3fd177c03009 Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.593231 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e5895ae1ce3134ff6b7749cde7865a85f99287ea2067518bf4fe851c7db5b129 WatchSource:0}: Error finding container e5895ae1ce3134ff6b7749cde7865a85f99287ea2067518bf4fe851c7db5b129: Status 404 returned error can't find the container with id e5895ae1ce3134ff6b7749cde7865a85f99287ea2067518bf4fe851c7db5b129 Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.613942 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-266776467af6a2a96a278c6ceb97290c41905a1999a499481d0b3226b5671daf WatchSource:0}: Error finding container 266776467af6a2a96a278c6ceb97290c41905a1999a499481d0b3226b5671daf: Status 404 returned error can't find the container with id 266776467af6a2a96a278c6ceb97290c41905a1999a499481d0b3226b5671daf Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.641902 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.843361 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.843504 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.865813 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.867513 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.867596 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.867636 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.867689 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 
00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.868523 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.027187 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.095735 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"38b9c4b82dacfca28c66a23c0c39b80afdf00c14c9cdc7cae09e7662b0f01564"} Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.096806 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"266776467af6a2a96a278c6ceb97290c41905a1999a499481d0b3226b5671daf"} Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.097986 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e5895ae1ce3134ff6b7749cde7865a85f99287ea2067518bf4fe851c7db5b129"} Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.099063 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cad439d2f13373ddf0a9f9dafc0ab855b098fa917e6a8b1d9bd3fd177c03009"} Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.101206 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"892b62111794c7ab545a80c0afb380ed7cc2f821a9903a1312a809a47a88e8d9"} Mar 16 00:06:33 crc kubenswrapper[4983]: W0316 00:06:33.323103 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.323516 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:33 crc kubenswrapper[4983]: W0316 00:06:33.413051 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.413141 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:33 crc kubenswrapper[4983]: W0316 00:06:33.424386 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.424598 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.442921 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.669486 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.672519 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.672587 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.672602 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.672651 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.673337 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.026845 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.089743 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:06:34 crc kubenswrapper[4983]: E0316 00:06:34.091243 4983 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.105321 4983 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173" exitCode=0 Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.105463 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173"}
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.105487 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.106470 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.106495 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.106506 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108718 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75"}
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108746 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968"}
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108779 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272"}
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108791 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a"}
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108803 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.110888 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.110944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.110957 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.111608 4983 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce" exitCode=0
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.111678 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce"}
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.111737 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112870 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112909 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112923 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112931 4983 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10" exitCode=0
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112985 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10"}
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.113247 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114111 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719" exitCode=0
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114130 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719"}
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114162 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114183 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114203 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114942 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114965 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114975 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.116596 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.118062 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.118104 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.118116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.026482 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 16 00:06:35 crc kubenswrapper[4983]: E0316 00:06:35.044614 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.124194 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.124250 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.124264 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.124276 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.126120 4983 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230" exitCode=0
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.126209 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.126392 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.127334 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.127373 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.127386 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.130089 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.130130 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.131117 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.131145 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.131156 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.138833 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.138905 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.138789 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139343 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139382 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712"}
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139874 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139907 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139920 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.140464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.140540 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.140556 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.274260 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.276309 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.276374 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.276388 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.276424 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:06:35 crc kubenswrapper[4983]: E0316 00:06:35.277153 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc"
Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.906821 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.151654 4983 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236" exitCode=0
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.151836 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236"}
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.152097 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.153528 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.153588 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.153607 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159586 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed"}
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159623 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159720 4983 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159809 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159729 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159828 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.162151 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.162233 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.163845 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.163896 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.163916 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.164874 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.164913 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.164931 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.165169 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.165231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.165252 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.300113 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.014958 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170191 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180"} Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170261 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31"} Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170281 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8"} Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170312 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170389 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172079 4983 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172128 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172175 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172127 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172245 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172287 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.312594 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.312871 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.314474 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.314549 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.314575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.170694 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.179248 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961"} Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.179384 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32"} Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.179318 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.179305 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181171 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181214 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181384 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181481 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.345991 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.477920 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.480140 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.480211 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.480236 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.480280 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.676651 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.181677 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.181814 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.182857 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.182908 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.182925 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.183247 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.183268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.183278 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.184442 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.185382 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.185458 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.185477 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.786052 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.786258 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.787521 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.787600 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.787622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:42 crc kubenswrapper[4983]: E0316 00:06:42.167273 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.357323 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.357850 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.360103 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.360166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.360188 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.708099 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.708344 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.710115 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.710168 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.710187 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.715750 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.195322 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.196902 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.196955 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.196970 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.200849 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.197812 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.199058 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.199118 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.199138 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.358257 4983 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.358331 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:06:45 crc kubenswrapper[4983]: W0316 00:06:45.561398 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.561504 4983 trace.go:236] Trace[878499469]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Mar-2026 00:06:35.559) (total time: 10001ms): Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[878499469]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:45.561) Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[878499469]: [10.001558097s] [10.001558097s] END Mar 16 00:06:45 crc kubenswrapper[4983]: E0316 00:06:45.561530 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 16 00:06:45 crc kubenswrapper[4983]: W0316 00:06:45.580186 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.580326 4983 trace.go:236] Trace[577567899]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Mar-2026 00:06:35.578) (total time: 10001ms):
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[577567899]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:45.580)
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[577567899]: [10.001678111s] [10.001678111s] END
Mar 16 00:06:45 crc kubenswrapper[4983]: E0316 00:06:45.580362 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 16 00:06:45 crc kubenswrapper[4983]: W0316 00:06:45.948169 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.948287 4983 trace.go:236] Trace[254954146]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Mar-2026 00:06:35.946) (total time: 10001ms):
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[254954146]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:45.948)
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[254954146]: [10.001825475s] [10.001825475s] END
Mar 16 00:06:45 crc kubenswrapper[4983]: E0316 00:06:45.948316 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 16 00:06:45 crc kubenswrapper[4983]: E0316 00:06:45.992016 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:45Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d299f34e6fcf6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,LastTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.996444 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:45Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.998059 4983 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.998136 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 16 00:06:46 crc kubenswrapper[4983]: E0316 00:06:46.000981 4983 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 16 00:06:46 crc kubenswrapper[4983]: W0316 00:06:46.003336 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:46 crc kubenswrapper[4983]: E0316 00:06:46.003424 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.004132 4983 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.004180 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 16 00:06:46 crc kubenswrapper[4983]: E0316 00:06:46.009106 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z" node="crc"
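Unlike the earlier dial errors, the two startup-probe failures just above are application-level: the apiserver is now serving, but the probe reaches /livez without credentials and is rejected as system:anonymous with HTTP 403. A hedged Go sketch of reproducing that probe by hand follows; it assumes the api-int.crc.testing endpoint from the log is resolvable from where it runs, and it skips TLS verification only because the serving certificate is expired at this point in the log:

    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"io"
    	"net/http"
    	"time"
    )

    func main() {
    	client := &http.Client{
    		Timeout: 5 * time.Second,
    		Transport: &http.Transport{
    			// The serving cert is expired here (see the x509 errors), so
    			// skip verification for this one diagnostic request only.
    			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
    		},
    	}
    	// Same path the startup probe hits; no bearer token is sent, so the
    	// apiserver treats the request as system:anonymous.
    	resp, err := client.Get("https://api-int.crc.testing:6443/livez")
    	if err != nil {
    		fmt.Println("request failed:", err)
    		return
    	}
    	defer resp.Body.Close()
    	body, _ := io.ReadAll(resp.Body)
    	// Expect: 403 and a Status object with reason "Forbidden", matching
    	// the probe output recorded above.
    	fmt.Println(resp.StatusCode, string(body))
    }
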
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.029867 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.202129 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.203407 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed" exitCode=255 Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.203454 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed"} Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.203618 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.204398 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.204425 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.204433 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.204959 4983 scope.go:117] "RemoveContainer" containerID="6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed" Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.906854 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.031289 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:47Z is after 2026-02-23T05:33:13Z Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.049737 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.049968 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.051619 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.051673 4983 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.051689 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.109943 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.208098 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.209431 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.209856 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0"} Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.210029 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.211110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.211328 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.211347 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.213231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.213273 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.213288 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.226253 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.031610 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:48Z is after 2026-02-23T05:33:13Z Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.214132 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.215571 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.216727 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" exitCode=255 Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.216913 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.217337 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0"} Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.217378 4983 scope.go:117] "RemoveContainer" containerID="6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.217892 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.218005 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.218110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.218563 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.219362 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.219398 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.219412 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.220029 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:48 crc kubenswrapper[4983]: E0316 00:06:48.220223 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.032174 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.222135 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.224750 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.225944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.226176 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.226363 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.227530 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:49 crc kubenswrapper[4983]: E0316 00:06:49.228028 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:06:49 crc kubenswrapper[4983]: W0316 00:06:49.569890 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z Mar 16 00:06:49 crc kubenswrapper[4983]: E0316 00:06:49.570006 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:06:49 crc kubenswrapper[4983]: W0316 00:06:49.904328 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z Mar 16 00:06:49 crc kubenswrapper[4983]: E0316 00:06:49.904427 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.031604 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:50Z is after 2026-02-23T05:33:13Z Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.790603 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.790790 4983 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.792087 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.792129 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.792140 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.792848 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:50 crc kubenswrapper[4983]: E0316 00:06:50.793027 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.799986 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.031559 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:51Z is after 2026-02-23T05:33:13Z Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.229614 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.230691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.230721 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.230731 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.231250 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:51 crc kubenswrapper[4983]: E0316 00:06:51.231404 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:06:51 crc kubenswrapper[4983]: W0316 00:06:51.691316 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:51Z is after 2026-02-23T05:33:13Z Mar 16 00:06:51 crc 
kubenswrapper[4983]: E0316 00:06:51.691426 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.033578 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:52 crc kubenswrapper[4983]: W0316 00:06:52.078042 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 16 00:06:52 crc kubenswrapper[4983]: E0316 00:06:52.078129 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:06:52 crc kubenswrapper[4983]: E0316 00:06:52.167437 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.409401 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.410696 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.410736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.410747 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.410796 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:52 crc kubenswrapper[4983]: E0316 00:06:52.415537 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:06:52 crc kubenswrapper[4983]: E0316 00:06:52.415884 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:06:53 crc kubenswrapper[4983]: I0316 00:06:53.035084 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:54 crc kubenswrapper[4983]: I0316 00:06:54.030438 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:54 crc kubenswrapper[4983]: I0316 00:06:54.363669 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:06:54 crc kubenswrapper[4983]: I0316 00:06:54.386516 4983 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 16 00:06:55 crc kubenswrapper[4983]: I0316 00:06:55.033655 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:55 crc kubenswrapper[4983]: I0316 00:06:55.359515 4983 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:06:55 crc kubenswrapper[4983]: I0316 00:06:55.359630 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.003575 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f34e6fcf6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,LastTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.010813 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.017691 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.024981 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.031597 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.032081 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f3d6059ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.160106991 +0000 UTC m=+0.760205441,LastTimestamp:2026-03-16 00:06:32.160106991 +0000 UTC m=+0.760205441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.036173 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.194719643 +0000 UTC 
m=+0.794818073,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.039842 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.194742375 +0000 UTC m=+0.794840805,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.045947 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.194772117 +0000 UTC m=+0.794870547,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.052358 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.196029329 +0000 UTC m=+0.796127759,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.058960 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.19604111 +0000 UTC m=+0.796139540,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.066968 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.196049971 +0000 UTC m=+0.796148401,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.074292 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.197179585 +0000 UTC m=+0.797278065,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.077544 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.197201526 +0000 UTC m=+0.797299956,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.081051 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.197215647 +0000 UTC m=+0.797314077,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.087540 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.197246259 +0000 UTC m=+0.797344689,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.089438 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.197360637 +0000 UTC m=+0.797459087,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.096361 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.197384388 +0000 UTC m=+0.797482838,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.104314 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.197400279 +0000 UTC m=+0.797498719,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.111310 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.198047742 +0000 UTC m=+0.798146212,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.118520 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.198220053 +0000 UTC m=+0.798318523,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.125340 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.198437537 +0000 UTC m=+0.798536017,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.132008 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: 
User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.198495981 +0000 UTC m=+0.798594451,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.138506 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.198519163 +0000 UTC m=+0.798617603,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.144943 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.198581967 +0000 UTC m=+0.798680407,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.151620 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.198596568 +0000 UTC m=+0.798695008,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.160105 4983 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299f561dbc08 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.575171592 +0000 UTC m=+1.175270052,LastTimestamp:2026-03-16 00:06:32.575171592 +0000 UTC m=+1.175270052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.167085 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299f56606be5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.579541989 +0000 UTC m=+1.179640459,LastTimestamp:2026-03-16 00:06:32.579541989 +0000 UTC m=+1.179640459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.173607 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f571baf0f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.591814415 +0000 UTC m=+1.191912885,LastTimestamp:2026-03-16 00:06:32.591814415 +0000 UTC m=+1.191912885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.180851 4983 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299f5768971c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.596854556 +0000 UTC m=+1.196953016,LastTimestamp:2026-03-16 00:06:32.596854556 +0000 UTC m=+1.196953016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.185006 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299f5945840c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.628110348 +0000 UTC m=+1.228208818,LastTimestamp:2026-03-16 00:06:32.628110348 +0000 UTC m=+1.228208818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.187995 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f7ad72767 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.191303015 +0000 UTC m=+1.791401465,LastTimestamp:2026-03-16 00:06:33.191303015 +0000 UTC m=+1.791401465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.192797 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189d299f7af1a144 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.193038148 +0000 UTC m=+1.793136588,LastTimestamp:2026-03-16 00:06:33.193038148 +0000 UTC m=+1.793136588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.195479 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299f7b5ea7f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.200183288 +0000 UTC m=+1.800281718,LastTimestamp:2026-03-16 00:06:33.200183288 +0000 UTC m=+1.800281718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.200108 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f7b6ee117 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.201246487 +0000 UTC m=+1.801344917,LastTimestamp:2026-03-16 00:06:33.201246487 +0000 UTC m=+1.801344917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.202395 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299f7b7c8c08 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 
00:06:33.202142216 +0000 UTC m=+1.802240656,LastTimestamp:2026-03-16 00:06:33.202142216 +0000 UTC m=+1.802240656,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.209178 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f7b92600e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.20357275 +0000 UTC m=+1.803671180,LastTimestamp:2026-03-16 00:06:33.20357275 +0000 UTC m=+1.803671180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.216512 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299f7b9ce56f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.204262255 +0000 UTC m=+1.804360685,LastTimestamp:2026-03-16 00:06:33.204262255 +0000 UTC m=+1.804360685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.223077 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299f7c503e91 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.216016017 +0000 UTC m=+1.816114457,LastTimestamp:2026-03-16 00:06:33.216016017 +0000 UTC m=+1.816114457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 
00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.229673 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299f7cf60644 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.22688058 +0000 UTC m=+1.826979010,LastTimestamp:2026-03-16 00:06:33.22688058 +0000 UTC m=+1.826979010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.236498 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299f7d1db429 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.229481001 +0000 UTC m=+1.829579441,LastTimestamp:2026-03-16 00:06:33.229481001 +0000 UTC m=+1.829579441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.245060 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299f7d24a955 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.229936981 +0000 UTC m=+1.830035411,LastTimestamp:2026-03-16 00:06:33.229936981 +0000 UTC m=+1.830035411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.252827 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8d2fae19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.499094553 +0000 UTC m=+2.099193013,LastTimestamp:2026-03-16 00:06:33.499094553 +0000 UTC m=+2.099193013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.259852 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8e097a06 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.51336807 +0000 UTC m=+2.113466530,LastTimestamp:2026-03-16 00:06:33.51336807 +0000 UTC m=+2.113466530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.266038 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8e338e37 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.516125751 +0000 UTC m=+2.116224221,LastTimestamp:2026-03-16 00:06:33.516125751 +0000 UTC m=+2.116224221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.272981 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f9ab64ff6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.726021622 +0000 UTC m=+2.326120052,LastTimestamp:2026-03-16 00:06:33.726021622 +0000 UTC m=+2.326120052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.280804 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f9b2428c0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.733220544 +0000 UTC m=+2.333318974,LastTimestamp:2026-03-16 00:06:33.733220544 +0000 UTC m=+2.333318974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.288200 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f9b41e782 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.735169922 +0000 UTC m=+2.335268352,LastTimestamp:2026-03-16 00:06:33.735169922 +0000 UTC m=+2.335268352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.294994 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299fa7a2c5eb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.942844907 +0000 UTC m=+2.542943327,LastTimestamp:2026-03-16 00:06:33.942844907 +0000 UTC m=+2.542943327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.300457 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.300710 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.300795 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299fa838d249 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.952678473 +0000 UTC m=+2.552776913,LastTimestamp:2026-03-16 00:06:33.952678473 +0000 UTC m=+2.552776913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.306078 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.306153 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.306177 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.307845 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.308136 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.313044 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299fb177eb9a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.107808666 +0000 UTC m=+2.707907096,LastTimestamp:2026-03-16 00:06:34.107808666 +0000 UTC m=+2.707907096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.320844 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299fb1e622e5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.115031781 +0000 UTC m=+2.715130241,LastTimestamp:2026-03-16 00:06:34.115031781 +0000 UTC m=+2.715130241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.325500 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fb1fadfb5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.116390837 +0000 UTC m=+2.716489277,LastTimestamp:2026-03-16 00:06:34.116390837 +0000 UTC m=+2.716489277,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.330414 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fb1fb26b9 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.116409017 +0000 UTC m=+2.716507457,LastTimestamp:2026-03-16 00:06:34.116409017 +0000 UTC m=+2.716507457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.336034 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299fc04ac634 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.356508212 +0000 UTC m=+2.956606642,LastTimestamp:2026-03-16 00:06:34.356508212 +0000 UTC m=+2.956606642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.341508 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fc0949d12 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.361347346 +0000 UTC m=+2.961445776,LastTimestamp:2026-03-16 00:06:34.361347346 +0000 UTC m=+2.961445776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.347332 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299fc0c53272 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container 
etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.364531314 +0000 UTC m=+2.964629744,LastTimestamp:2026-03-16 00:06:34.364531314 +0000 UTC m=+2.964629744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.353234 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fc0efb564 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.367317348 +0000 UTC m=+2.967415778,LastTimestamp:2026-03-16 00:06:34.367317348 +0000 UTC m=+2.967415778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.358390 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299fc19885e5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.378380773 +0000 UTC m=+2.978479203,LastTimestamp:2026-03-16 00:06:34.378380773 +0000 UTC m=+2.978479203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.364885 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fc1eda1f9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.383958521 +0000 UTC m=+2.984056951,LastTimestamp:2026-03-16 00:06:34.383958521 +0000 UTC m=+2.984056951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.369661 4983 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fc2003c35 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.385177653 +0000 UTC m=+2.985276083,LastTimestamp:2026-03-16 00:06:34.385177653 +0000 UTC m=+2.985276083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.373903 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299fc2a427da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.395920346 +0000 UTC m=+2.996018776,LastTimestamp:2026-03-16 00:06:34.395920346 +0000 UTC m=+2.996018776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.378584 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fc2c091bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.397782459 +0000 UTC m=+2.997880889,LastTimestamp:2026-03-16 00:06:34.397782459 +0000 UTC m=+2.997880889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.384477 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fc335156c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.405418348 +0000 UTC m=+3.005516788,LastTimestamp:2026-03-16 00:06:34.405418348 +0000 UTC m=+3.005516788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.390077 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fce6b26ca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.593511114 +0000 UTC m=+3.193609544,LastTimestamp:2026-03-16 00:06:34.593511114 +0000 UTC m=+3.193609544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.395653 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fce71a0bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.593935548 +0000 UTC m=+3.194033988,LastTimestamp:2026-03-16 00:06:34.593935548 +0000 UTC m=+3.194033988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.401285 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fcf16299d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.604718493 +0000 UTC m=+3.204816923,LastTimestamp:2026-03-16 00:06:34.604718493 +0000 UTC m=+3.204816923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.406227 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fcf24b915 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.605672725 +0000 UTC m=+3.205771155,LastTimestamp:2026-03-16 00:06:34.605672725 +0000 UTC m=+3.205771155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.411311 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fcf474432 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.607936562 +0000 UTC m=+3.208034992,LastTimestamp:2026-03-16 00:06:34.607936562 +0000 UTC m=+3.208034992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.416593 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fcf58d043 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.609086531 
+0000 UTC m=+3.209184961,LastTimestamp:2026-03-16 00:06:34.609086531 +0000 UTC m=+3.209184961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.423050 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fdc11939a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.822521754 +0000 UTC m=+3.422620194,LastTimestamp:2026-03-16 00:06:34.822521754 +0000 UTC m=+3.422620194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.429777 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fdc1210b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.822553776 +0000 UTC m=+3.422652206,LastTimestamp:2026-03-16 00:06:34.822553776 +0000 UTC m=+3.422652206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.435327 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fdcb3a428 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.833142824 +0000 UTC m=+3.433241254,LastTimestamp:2026-03-16 00:06:34.833142824 +0000 UTC m=+3.433241254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.442079 4983 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fdd0916ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.838742714 +0000 UTC m=+3.438841144,LastTimestamp:2026-03-16 00:06:34.838742714 +0000 UTC m=+3.438841144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.447098 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fdd1d19b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.840054198 +0000 UTC m=+3.440152628,LastTimestamp:2026-03-16 00:06:34.840054198 +0000 UTC m=+3.440152628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.452147 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fe70bdbf6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.006696438 +0000 UTC m=+3.606794878,LastTimestamp:2026-03-16 00:06:35.006696438 +0000 UTC m=+3.606794878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.458476 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fe7bb9fb4 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.018215348 +0000 UTC m=+3.618313778,LastTimestamp:2026-03-16 00:06:35.018215348 +0000 UTC m=+3.618313778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.464565 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fe7ca13ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.01916254 +0000 UTC m=+3.619260980,LastTimestamp:2026-03-16 00:06:35.01916254 +0000 UTC m=+3.619260980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.471187 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299fee514728 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.128686376 +0000 UTC m=+3.728784806,LastTimestamp:2026-03-16 00:06:35.128686376 +0000 UTC m=+3.728784806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.477555 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299ff408769c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.224577692 +0000 UTC m=+3.824676122,LastTimestamp:2026-03-16 00:06:35.224577692 +0000 UTC m=+3.824676122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.483691 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299ff4ee60bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.239645372 +0000 UTC m=+3.839743812,LastTimestamp:2026-03-16 00:06:35.239645372 +0000 UTC m=+3.839743812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.490641 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299ffaa912d9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.335766745 +0000 UTC m=+3.935865175,LastTimestamp:2026-03-16 00:06:35.335766745 +0000 UTC m=+3.935865175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.496013 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299ffb9eafdc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.35186326 +0000 UTC m=+3.951961690,LastTimestamp:2026-03-16 00:06:35.35186326 +0000 UTC m=+3.951961690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.502882 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a02b877192 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.155646354 +0000 UTC m=+4.755744814,LastTimestamp:2026-03-16 00:06:36.155646354 +0000 UTC m=+4.755744814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.509531 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a03ad871e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.412613091 +0000 UTC m=+5.012711531,LastTimestamp:2026-03-16 00:06:36.412613091 +0000 UTC m=+5.012711531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.514829 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a03ba5fcb3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.426083507 +0000 UTC m=+5.026181937,LastTimestamp:2026-03-16 00:06:36.426083507 +0000 UTC m=+5.026181937,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.519629 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a03bb8133f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.427268927 +0000 UTC m=+5.027367357,LastTimestamp:2026-03-16 00:06:36.427268927 +0000 UTC m=+5.027367357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.523857 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a04b329398 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.686955416 +0000 UTC m=+5.287053846,LastTimestamp:2026-03-16 00:06:36.686955416 +0000 UTC m=+5.287053846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.529132 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a04c514f9a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.705746842 +0000 UTC m=+5.305845302,LastTimestamp:2026-03-16 00:06:36.705746842 +0000 UTC m=+5.305845302,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.534209 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a04c6faebf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.707737279 +0000 UTC m=+5.307835739,LastTimestamp:2026-03-16 00:06:36.707737279 +0000 UTC 
m=+5.307835739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.539120 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a05ccca4d0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.98226504 +0000 UTC m=+5.582363500,LastTimestamp:2026-03-16 00:06:36.98226504 +0000 UTC m=+5.582363500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.543696 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a05d827a02 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.994181634 +0000 UTC m=+5.594280094,LastTimestamp:2026-03-16 00:06:36.994181634 +0000 UTC m=+5.594280094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.549003 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a05d997feb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.995690475 +0000 UTC m=+5.595788905,LastTimestamp:2026-03-16 00:06:36.995690475 +0000 UTC m=+5.595788905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.554408 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a06e45ff24 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.275430692 +0000 UTC m=+5.875529112,LastTimestamp:2026-03-16 00:06:37.275430692 +0000 UTC m=+5.875529112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.559151 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a06f81c9c3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.296126403 +0000 UTC m=+5.896224833,LastTimestamp:2026-03-16 00:06:37.296126403 +0000 UTC m=+5.896224833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.563962 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a06f9894ba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.297620154 +0000 UTC m=+5.897718584,LastTimestamp:2026-03-16 00:06:37.297620154 +0000 UTC m=+5.897718584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.569220 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a07d13e65d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.523805789 +0000 UTC m=+6.123904229,LastTimestamp:2026-03-16 00:06:37.523805789 +0000 UTC m=+6.123904229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.573608 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a07dc7e63e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.535602238 +0000 UTC m=+6.135700678,LastTimestamp:2026-03-16 00:06:37.535602238 +0000 UTC m=+6.135700678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.583230 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:06:56 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a2500d0186 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:06:56 crc kubenswrapper[4983]: body: Mar 16 00:06:56 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.358313862 +0000 UTC m=+13.958412292,LastTimestamp:2026-03-16 00:06:45.358313862 +0000 UTC m=+13.958412292,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:06:56 crc kubenswrapper[4983]: > Mar 16 00:06:56 crc kubenswrapper[4983]: W0316 00:06:56.588745 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.588856 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.588831 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a2500dc87a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.358364794 +0000 UTC m=+13.958463224,LastTimestamp:2026-03-16 00:06:45.358364794 +0000 UTC m=+13.958463224,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.595844 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:06:56 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a2762f9bde openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 16 00:06:56 crc kubenswrapper[4983]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:06:56 crc kubenswrapper[4983]: Mar 16 00:06:56 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.998115806 +0000 UTC m=+14.598214246,LastTimestamp:2026-03-16 00:06:45.998115806 +0000 UTC m=+14.598214246,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:06:56 crc kubenswrapper[4983]: > Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.602686 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a276305972 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.998164338 +0000 UTC m=+14.598262768,LastTimestamp:2026-03-16 00:06:45.998164338 +0000 UTC m=+14.598262768,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.610863 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a2762f9bde\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:06:56 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a2762f9bde openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 16 00:06:56 crc kubenswrapper[4983]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:06:56 crc kubenswrapper[4983]: Mar 16 00:06:56 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.998115806 +0000 UTC m=+14.598214246,LastTimestamp:2026-03-16 00:06:46.004165431 +0000 UTC m=+14.604263851,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:06:56 crc kubenswrapper[4983]: > Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.618628 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a276305972\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a276305972 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.998164338 +0000 UTC m=+14.598262768,LastTimestamp:2026-03-16 00:06:46.004201442 +0000 UTC m=+14.604299872,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.624308 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d299fe7ca13ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fe7ca13ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.01916254 +0000 UTC m=+3.619260980,LastTimestamp:2026-03-16 00:06:46.206131756 +0000 UTC m=+14.806230196,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc 
kubenswrapper[4983]: E0316 00:06:56.629738 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d299ff408769c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299ff408769c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.224577692 +0000 UTC m=+3.824676122,LastTimestamp:2026-03-16 00:06:46.405514264 +0000 UTC m=+15.005612704,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.635866 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d299ff4ee60bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299ff4ee60bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.239645372 +0000 UTC m=+3.839743812,LastTimestamp:2026-03-16 00:06:46.413881148 +0000 UTC m=+15.013979598,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.644791 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a2500d0186\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:06:56 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a2500d0186 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:06:56 crc kubenswrapper[4983]: body: Mar 16 00:06:56 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.358313862 +0000 UTC m=+13.958412292,LastTimestamp:2026-03-16 00:06:55.359600166 +0000 UTC m=+23.959698636,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 
00:06:56 crc kubenswrapper[4983]: > Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.649874 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a2500dc87a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a2500dc87a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.358364794 +0000 UTC m=+13.958463224,LastTimestamp:2026-03-16 00:06:55.359685718 +0000 UTC m=+23.959784198,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.906853 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.032171 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.245183 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.246937 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.247015 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.247041 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.248026 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.033812 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.255928 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.260310 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776"} Mar 16 00:06:58 crc kubenswrapper[4983]: 
I0316 00:06:58.260605 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.262155 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.262222 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.262245 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.034959 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:59 crc kubenswrapper[4983]: W0316 00:06:59.142508 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 16 00:06:59 crc kubenswrapper[4983]: E0316 00:06:59.142606 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.266669 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.268060 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.270902 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" exitCode=255 Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.270973 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776"} Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.271072 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.271267 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.272687 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.272799 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.272828 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.274221 4983 
scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:06:59 crc kubenswrapper[4983]: E0316 00:06:59.274717 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.416141 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.419327 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.419413 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.419431 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.419463 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:59 crc kubenswrapper[4983]: E0316 00:06:59.422696 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:06:59 crc kubenswrapper[4983]: E0316 00:06:59.423189 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:00 crc kubenswrapper[4983]: I0316 00:07:00.032089 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:00 crc kubenswrapper[4983]: I0316 00:07:00.275813 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:01 crc kubenswrapper[4983]: I0316 00:07:01.030124 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:01 crc kubenswrapper[4983]: W0316 00:07:01.116446 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 16 00:07:01 crc kubenswrapper[4983]: E0316 00:07:01.116500 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" 
logger="UnhandledError" Mar 16 00:07:02 crc kubenswrapper[4983]: I0316 00:07:02.033179 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:02 crc kubenswrapper[4983]: E0316 00:07:02.168009 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:02 crc kubenswrapper[4983]: W0316 00:07:02.545712 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:02 crc kubenswrapper[4983]: E0316 00:07:02.546096 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:03 crc kubenswrapper[4983]: I0316 00:07:03.030562 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.032824 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.398237 4983 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55000->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.398338 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55000->192.168.126.11:10357: read: connection reset by peer" Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.398434 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.398660 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.400531 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.400595 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.400621 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 
00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.401485 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.401866 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272" gracePeriod=30 Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.407607 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:04 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a6beec563f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:55000->192.168.126.11:10357: read: connection reset by peer Mar 16 00:07:04 crc kubenswrapper[4983]: body: Mar 16 00:07:04 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:04.398313023 +0000 UTC m=+32.998411493,LastTimestamp:2026-03-16 00:07:04.398313023 +0000 UTC m=+32.998411493,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:04 crc kubenswrapper[4983]: > Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.410032 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a6beed6bb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55000->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:04.398384054 +0000 UTC m=+32.998482524,LastTimestamp:2026-03-16 00:07:04.398384054 +0000 UTC m=+32.998482524,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.413312 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a6bf2205a2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:04.40183133 +0000 UTC m=+33.001929840,LastTimestamp:2026-03-16 00:07:04.40183133 +0000 UTC m=+33.001929840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.421264 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d299f7b92600e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f7b92600e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.20357275 +0000 UTC m=+1.803671180,LastTimestamp:2026-03-16 00:07:04.420139925 +0000 UTC m=+33.020238395,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.605998 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d299f8d2fae19\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8d2fae19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.499094553 +0000 UTC m=+2.099193013,LastTimestamp:2026-03-16 00:07:04.600028291 +0000 UTC m=+33.200126721,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.614142 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d299f8e097a06\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8e097a06 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.51336807 +0000 UTC m=+2.113466530,LastTimestamp:2026-03-16 00:07:04.610078051 +0000 UTC m=+33.210176501,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.032278 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.294359 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.294993 4983 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272" exitCode=255 Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.295041 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272"} Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.295086 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907"} Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.295211 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.296442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.296507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.296525 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.031656 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.300951 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.302134 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.303554 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.303628 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.303653 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.304645 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:07:06 crc kubenswrapper[4983]: E0316 00:07:06.305017 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.423209 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.425130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.425178 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.425191 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.425222 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:06 crc kubenswrapper[4983]: E0316 00:07:06.431941 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:06 crc kubenswrapper[4983]: E0316 00:07:06.431979 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.907239 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.034280 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.300203 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.301109 4983 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.301157 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.301194 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.301881 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:07:07 crc kubenswrapper[4983]: E0316 00:07:07.302103 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.312791 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.312902 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.313687 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.313875 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.314006 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:08 crc kubenswrapper[4983]: I0316 00:07:08.031084 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:09 crc kubenswrapper[4983]: I0316 00:07:09.033968 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:10 crc kubenswrapper[4983]: I0316 00:07:10.029967 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:11 crc kubenswrapper[4983]: I0316 00:07:11.031353 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.030444 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:12 crc kubenswrapper[4983]: E0316 00:07:12.168138 4983 eviction_manager.go:285] 
"Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.358235 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.358527 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.360262 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.360334 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.360356 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.032382 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.432559 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.434039 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.434102 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.434127 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.434169 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:13 crc kubenswrapper[4983]: E0316 00:07:13.437285 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:13 crc kubenswrapper[4983]: E0316 00:07:13.438491 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:14 crc kubenswrapper[4983]: I0316 00:07:14.033655 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:14 crc kubenswrapper[4983]: W0316 00:07:14.975957 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 16 00:07:14 crc kubenswrapper[4983]: E0316 00:07:14.976025 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot 
list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.032136 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.131309 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.131516 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.134228 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.134258 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.134268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.136953 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.320041 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.320934 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.320964 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.320973 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:16 crc kubenswrapper[4983]: I0316 00:07:16.030638 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:17 crc kubenswrapper[4983]: I0316 00:07:17.034545 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:18 crc kubenswrapper[4983]: W0316 00:07:18.003508 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:18 crc kubenswrapper[4983]: E0316 00:07:18.003948 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.031246 4983 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.092125 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.093601 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.093877 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.094089 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.095286 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:07:18 crc kubenswrapper[4983]: E0316 00:07:18.095842 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:18 crc kubenswrapper[4983]: W0316 00:07:18.284971 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 16 00:07:18 crc kubenswrapper[4983]: E0316 00:07:18.285024 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:19 crc kubenswrapper[4983]: I0316 00:07:19.032376 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.027707 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.437888 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.439603 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.439636 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.439646 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.439670 4983 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:20 crc kubenswrapper[4983]: E0316 00:07:20.445608 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:20 crc kubenswrapper[4983]: E0316 00:07:20.445881 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:21 crc kubenswrapper[4983]: I0316 00:07:21.031448 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:22 crc kubenswrapper[4983]: I0316 00:07:22.030781 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:22 crc kubenswrapper[4983]: E0316 00:07:22.168276 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:23 crc kubenswrapper[4983]: I0316 00:07:23.030929 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:24 crc kubenswrapper[4983]: I0316 00:07:24.030506 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:25 crc kubenswrapper[4983]: I0316 00:07:25.031715 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:26 crc kubenswrapper[4983]: I0316 00:07:26.033213 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:26 crc kubenswrapper[4983]: W0316 00:07:26.387338 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 16 00:07:26 crc kubenswrapper[4983]: E0316 00:07:26.387407 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.022845 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
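
Note on the repeated "forbidden" errors above: every request up to this point is attributed to User "system:anonymous", which has no RBAC bindings for nodes, leases, csinodes, csidrivers or runtimeclasses; the failures clear at 00:07:36 once csr-tjcxq is issued and the kubelet rotates to its new client certificate. A minimal client-go sketch that replays the same CSINode read and classifies the 403; the kubeconfig path is an assumption for the sketch, not taken from this log:

    package main

    import (
    	"context"
    	"fmt"

    	apierrors "k8s.io/apimachinery/pkg/api/errors"
    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Assumed kubeconfig path; substitute whatever credentials you want to test.
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}
    	// The same read the kubelet retries once per second above: GET csinodes/crc.
    	_, err = cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
    	switch {
    	case err == nil:
    		fmt.Println("csinodes/crc readable: credentials are accepted")
    	case apierrors.IsForbidden(err):
    		// 403: the request authenticated (here as system:anonymous) but has
    		// no RBAC binding for csinodes at the cluster scope.
    		fmt.Println("forbidden:", err)
    	default:
    		fmt.Println("other failure (authn or connectivity):", err)
    	}
    }

IsForbidden distinguishes an RBAC denial (authenticated but unauthorized) from an authentication or connectivity failure, which is exactly the distinction that matters in this boot sequence.
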
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.023429 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.025021 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.025058 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.025071 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.033610 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.446282 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.448161 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.448227 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.448250 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.448295 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:27 crc kubenswrapper[4983]: E0316 00:07:27.450609 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:27 crc kubenswrapper[4983]: E0316 00:07:27.450682 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:28 crc kubenswrapper[4983]: I0316 00:07:28.033248 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:29 crc kubenswrapper[4983]: I0316 00:07:29.034975 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:30 crc kubenswrapper[4983]: I0316 00:07:30.034152 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:31 crc kubenswrapper[4983]: I0316 00:07:31.035835 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.037960 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.091681 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.093260 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.093665 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.093687 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.094687 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:07:32 crc kubenswrapper[4983]: E0316 00:07:32.168379 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.367545 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.369362 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd"} Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.369545 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.370479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.370564 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.370622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.030716 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.374229 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.376134 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:33 crc kubenswrapper[4983]: 
I0316 00:07:33.378734 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" exitCode=255 Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.378825 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd"} Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.378877 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.379090 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.380527 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.380578 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.380595 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.381469 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:07:33 crc kubenswrapper[4983]: E0316 00:07:33.381879 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.037566 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.385178 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.451159 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.452174 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.452239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.452258 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.452299 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:34 crc kubenswrapper[4983]: E0316 00:07:34.458506 4983 controller.go:145] "Failed to ensure lease exists, will retry" 
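
The two pod_workers.go entries show the crash-loop delay for kube-apiserver-check-endpoints doubling from "back-off 20s" (00:07:18) to "back-off 40s" (00:07:33). A toy sketch of that schedule; the 10s base and 5m cap mirror upstream kubelet defaults and are assumed here, not read from this node's configuration:

    package main

    import (
    	"fmt"
    	"time"
    )

    // Crash-loop back-off sketch: the delay starts at a base value and doubles
    // on each failed restart, up to a cap.
    func main() {
    	base, maxDelay := 10*time.Second, 5*time.Minute
    	delay := base
    	for restart := 1; restart <= 8; restart++ {
    		fmt.Printf("restart %d: back-off %s\n", restart, delay)
    		delay *= 2
    		if delay > maxDelay {
    			delay = maxDelay
    		}
    	}
    }

Under those assumed defaults the schedule runs 10s, 20s, 40s, 1m20s, 2m40s, then pins at 5m, which is consistent with the 20s and 40s values logged above.
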
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:34 crc kubenswrapper[4983]: E0316 00:07:34.458736 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:35 crc kubenswrapper[4983]: I0316 00:07:35.036157 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.031973 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.300800 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.301123 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.302896 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.302948 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.302967 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.303848 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:07:36 crc kubenswrapper[4983]: E0316 00:07:36.304153 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.540850 4983 csr.go:261] certificate signing request csr-tjcxq is approved, waiting to be issued Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.551877 4983 csr.go:257] certificate signing request csr-tjcxq is issued Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.652463 4983 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.871143 4983 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.906910 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.907506 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:36 
crc kubenswrapper[4983]: I0316 00:07:36.909865 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.909970 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.910250 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.911113 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:07:36 crc kubenswrapper[4983]: E0316 00:07:36.911448 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:37 crc kubenswrapper[4983]: I0316 00:07:37.553523 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-18 11:41:22.468164912 +0000 UTC Mar 16 00:07:37 crc kubenswrapper[4983]: I0316 00:07:37.554464 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5939h33m44.913709387s for next certificate rotation Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.459821 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.461114 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.461202 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.461221 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.461323 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.472553 4983 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.472844 4983 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.472876 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476059 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476133 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476159 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
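
The csr/transport/certificate_manager entries above are the turning point: csr-tjcxq is approved and issued, the kubelet swaps its client credentials, and the system:anonymous failures stop. The logged rotation deadline (2026-11-18, roughly 70% of the way to the 2027-02-24 expiry) is consistent with the certificate manager picking a jittered deadline inside the certificate's lifetime. A sketch that derives similar numbers from the on-disk certificate; the PEM path and the 70-90% jitter window are assumptions based on upstream defaults:

    package main

    import (
    	"crypto/x509"
    	"encoding/pem"
    	"fmt"
    	"math/rand"
    	"os"
    	"time"
    )

    func main() {
    	// Assumed location of the kubelet's rotated client certificate.
    	raw, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
    	if err != nil {
    		panic(err)
    	}
    	block, _ := pem.Decode(raw)
    	if block == nil {
    		panic("no PEM block found")
    	}
    	cert, err := x509.ParseCertificate(block.Bytes)
    	if err != nil {
    		panic(err)
    	}
    	lifetime := cert.NotAfter.Sub(cert.NotBefore)
    	// Jittered rotation deadline somewhere in the assumed 70-90% window
    	// of the certificate lifetime.
    	jitter := 0.7 + 0.2*rand.Float64()
    	deadline := cert.NotBefore.Add(time.Duration(float64(lifetime) * jitter))
    	fmt.Println("certificate expires: ", cert.NotAfter)
    	fmt.Println("rotation deadline:   ", deadline)
    	fmt.Println("waiting for rotation:", time.Until(deadline))
    }
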
event="NodeNotReady" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476177 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:41Z","lastTransitionTime":"2026-03-16T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.499103 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505664 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505696 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505707 4983 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505724 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505736 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:41Z","lastTransitionTime":"2026-03-16T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.515200 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.524969 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.525004 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.525016 4983 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.525034 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.525047 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:41Z","lastTransitionTime":"2026-03-16T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.540073 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547521 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547543 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547552 4983 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547568 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547580 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:41Z","lastTransitionTime":"2026-03-16T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.560419 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.560544 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.560570 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.661226 4983 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.762123 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.862663 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.963735 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.063984 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.164296 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.168479 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.264426 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.365536 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.466697 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.567047 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.668140 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.768524 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.869670 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.970154 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.071260 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.172385 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.273574 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.374249 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.474486 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.575546 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc 
kubenswrapper[4983]: E0316 00:07:43.675716 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.776427 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.877485 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.978052 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.078685 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.179420 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.280580 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.380689 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.481597 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.582111 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.683194 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.783325 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.883529 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.984120 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.084577 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.185635 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.286459 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.387469 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.488624 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.589646 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.689853 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.790274 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.890839 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.991881 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.092154 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.192315 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.293448 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.393607 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.494205 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.594567 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.694842 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.795680 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.896696 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.997321 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.098362 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.198478 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.299514 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.400614 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.501277 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.601835 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.702961 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.804134 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.905016 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.005165 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.106242 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.207344 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.307815 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.408827 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.509887 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.610137 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.710303 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.811372 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.911906 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.012072 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.113092 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.213246 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.314349 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.414898 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.515460 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.616530 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.717390 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.818462 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.919569 4983 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.020588 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.092237 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.093775 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.093826 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.093836 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.094409 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.094602 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.121395 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.221797 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.322452 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.423438 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.524131 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.624456 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.725543 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.825872 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.926398 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.026861 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.127852 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.228530 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.329068 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.429794 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.530473 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.630610 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.731182 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.737977 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.738054 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.738079 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.738110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.738136 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:51Z","lastTransitionTime":"2026-03-16T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.756510 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765489 4983 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765538 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765556 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765582 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765601 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:51Z","lastTransitionTime":"2026-03-16T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.783585 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794164 4983 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794222 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794241 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794265 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794283 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:51Z","lastTransitionTime":"2026-03-16T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.828015 4983
kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.828033 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.928805 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.029254 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.130234 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.169624 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.230941 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.331736 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.432194 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.532469 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.633568 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.733744 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.834971 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.843746 4983 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938168 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938267 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938285 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:52Z","lastTransitionTime":"2026-03-16T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.012453 4983 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041697 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041789 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041805 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041827 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041874 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.047937 4983 apiserver.go:52] "Watching apiserver" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.056462 4983 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.057908 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf","openshift-ovn-kubernetes/ovnkube-node-wsfb4","openshift-machine-config-operator/machine-config-daemon-7sbnj","openshift-multus/network-metrics-daemon-qvtjp","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/iptables-alerter-4ln5h","openshift-dns/node-resolver-v748m","openshift-multus/multus-tqncp","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-image-registry/node-ca-d2h5k","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-multus/multus-additional-cni-plugins-pp6bs"] Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.059984 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.060447 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.060552 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.060650 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.060782 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.060853 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.060934 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.060993 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.061272 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.062134 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.062474 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.063061 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.063540 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.063583 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.063645 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.064208 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.065705 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.066221 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.066436 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.066739 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.067108 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.068380 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.070945 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.070921 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.073622 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074103 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074209 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074468 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074641 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074956 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075127 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075370 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075468 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075543 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075408 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075923 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.076522 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.076741 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.076984 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077177 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077380 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077492 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077662 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077915 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078072 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078225 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078332 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078437 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078342 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078641 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078377 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.079685 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.079748 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.080067 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.080114 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.093448 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.109830 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.122000 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.135549 4983 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.137574 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144362 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144401 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144420 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144440 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144455 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.152559 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161379 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161417 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161439 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161459 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161479 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161500 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161543 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161563 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161584 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161688 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161712 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161732 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161865 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161890 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161909 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161931 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161950 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161976 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161997 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162017 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162038 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162058 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162077 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162097 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.162152 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.662102549 +0000 UTC m=+82.262201019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162226 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162297 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162332 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162519 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162620 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162690 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162738 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162833 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162884 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163012 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163015 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163084 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163188 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163246 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163272 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163302 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163352 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163379 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163405 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163432 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163463 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163484 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163516 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163568 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163617 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163670 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163742 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163862 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163920 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163974 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164022 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164065 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164069 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164144 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164167 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164189 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164212 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164235 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164231 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164256 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164279 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164299 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164253 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164358 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164408 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164451 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164473 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164496 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164519 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164696 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164741 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164789 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164825 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164847 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164837 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164868 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164888 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164908 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164929 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164952 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164972 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164992 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165011 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165033 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165038 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165052 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165054 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165074 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165167 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165225 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165432 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165497 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165589 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165636 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165694 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165723 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165751 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165816 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165841 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165894 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165942 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165996 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166045 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166099 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166109 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166148 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166215 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166278 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166330 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166383 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166435 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166492 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166542 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166592 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166643 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166694 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166750 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166853 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166908 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166959 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167010 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167064 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168926 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168987 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169029 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169267 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169334 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169387 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169434 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169488 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169588 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169647 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169701 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169797 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169869 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169922 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169973 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170031 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170088 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170143 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170197 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170252 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170294 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170332 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170373 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170429 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170465 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170502 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170549 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170602 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171439 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171521 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171560 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171627 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171660 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171731 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172166 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172188 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172205 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172535 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172551 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172587 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172605 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName:
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172624 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172642 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172682 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172699 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172716 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172760 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172778 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172795 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172811 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172850 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172867 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172883 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172901 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172933 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172951 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172967 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173006 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173023 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173041 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173058 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173148 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173172 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173261 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173283 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173301 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173316 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173351 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173369 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173387 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173419 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173436 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173454 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173469 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173550 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173597 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173687 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173705 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173721 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174421 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") 
" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174468 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174509 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174613 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174651 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174688 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174724 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174858 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174953 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbxh\" (UniqueName: \"kubernetes.io/projected/43da17ff-aed1-44a2-a154-6800c3dd6ca9-kube-api-access-trbxh\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174993 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175032 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-system-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175086 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-cni-binary-copy\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175133 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-conf-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175187 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/48a48757-a3b8-4d4d-92ba-6a2459a26a86-mcd-auth-proxy-config\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175236 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175284 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-os-release\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175330 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/48a48757-a3b8-4d4d-92ba-6a2459a26a86-rootfs\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175377 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6740e33-489f-4f45-b3e5-fdceaebf4301-hosts-file\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175424 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175475 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48a48757-a3b8-4d4d-92ba-6a2459a26a86-proxy-tls\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175520 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175562 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175604 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-netns\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175650 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175700 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92wtj\" (UniqueName: \"kubernetes.io/projected/9138d88d-b777-4cab-b3d2-2099f01b205b-kube-api-access-92wtj\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175799 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175851 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-kubelet\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175909 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: 
\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175975 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176025 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176073 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176116 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176148 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176194 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-system-cni-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176241 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-binary-copy\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176291 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176339 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176473 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176528 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-multus-certs\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176578 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l24hn\" (UniqueName: \"kubernetes.io/projected/6993dda4-ac10-47af-b406-d49d7781fbe5-kube-api-access-l24hn\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176626 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-multus\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176673 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w6zq\" (UniqueName: \"kubernetes.io/projected/48a48757-a3b8-4d4d-92ba-6a2459a26a86-kube-api-access-5w6zq\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176720 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176803 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176863 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176917 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177034 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-socket-dir-parent\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177090 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-daemon-config\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177136 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjz9m\" (UniqueName: \"kubernetes.io/projected/f81ec143-6c51-4f96-ae71-a4759bac7c70-kube-api-access-gjz9m\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177184 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177226 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166437 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166474 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166470 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166838 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166842 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166822 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166865 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167315 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167373 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167576 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167637 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167810 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167980 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168202 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168235 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168480 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168504 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169121 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169521 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169659 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169729 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169722 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169894 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169994 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170012 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170042 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170638 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170793 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171079 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171123 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171348 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171499 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172380 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172604 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173406 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173487 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174035 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174278 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174340 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174348 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174355 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174806 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174972 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175451 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175458 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175715 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175797 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175481 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175915 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176070 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176287 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176372 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176560 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177116 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.178091 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.178220 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.178851 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179019 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179035 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179082 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179432 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179574 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180037 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180289 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180293 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180442 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180617 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180829 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180879 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181132 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181231 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181245 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181529 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181621 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181934 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.182390 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.182529 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.182833 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.182867 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.183284 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.183436 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.183963 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184007 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184145 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184424 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184871 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184914 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185038 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185060 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185156 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185265 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185353 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185075 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185540 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185695 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185924 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166704 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\
\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186092 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186554 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186572 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186646 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186923 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.187074 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.187498 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.187585 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.187617 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.189304 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.189395 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.689370335 +0000 UTC m=+82.289468805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.190233 4983 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.202716 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203034 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203178 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203520 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203580 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203604 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203601 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203637 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203691 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177276 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-cnibin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203904 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203985 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-bin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204043 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204052 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204116 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-etc-kubernetes\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204322 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/607f8329-b349-45da-bb9b-785740b4ad4f-serviceca\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204371 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.205795 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.205862 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.205921 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206059 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206210 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206387 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206433 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-cnibin\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206479 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-hostroot\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206523 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206571 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206621 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206860 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206911 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206955 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207001 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207031 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-os-release\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207065 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-k8s-cni-cncf-io\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207101 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8sn\" (UniqueName: \"kubernetes.io/projected/b6740e33-489f-4f45-b3e5-fdceaebf4301-kube-api-access-6g8sn\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207137 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207172 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207214 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207253 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207296 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207332 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/607f8329-b349-45da-bb9b-785740b4ad4f-host\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207355 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207387 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207422 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncxpw\" (UniqueName: \"kubernetes.io/projected/607f8329-b349-45da-bb9b-785740b4ad4f-kube-api-access-ncxpw\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207563 4983 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207581 4983 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207596 4983 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207610 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207629 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207643 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207659 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207672 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207690 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207704 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207718 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207735 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207749 4983 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207791 4983 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207811 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207837 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207856 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207875 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207892 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207918 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207937 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207958 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207977 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208000 4983 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208017 4983 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208035 4983 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208157 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208179 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208197 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208214 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208237 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208328 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208351 4983 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208376 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208402 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208424 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208527 4983 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208925 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208950 4983 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208969 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208987 4983 reconciler_common.go:293] "Volume detached 
for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209011 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209032 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209050 4983 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209068 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209094 4983 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209113 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209130 4983 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209147 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209171 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209192 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209210 4983 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209234 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209251 4983 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209269 4983 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209288 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209311 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209330 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209347 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209366 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209388 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209406 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209424 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209447 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209465 4983 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209519 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209537 4983 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209560 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209578 4983 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209597 4983 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209616 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209638 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209656 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209671 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209687 4983 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209709 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209725 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209741 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209790 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209809 4983 reconciler_common.go:293] 
"Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209824 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209843 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209865 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209880 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209895 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209920 4983 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209946 4983 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209962 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209979 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209995 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210015 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210031 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210048 4983 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210071 4983 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210089 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210106 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210121 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210144 4983 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210161 4983 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210178 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210193 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210213 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210230 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210247 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210269 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210287 4983 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210304 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210322 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210344 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210361 4983 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210377 4983 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210394 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210415 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210432 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210448 4983 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210465 4983 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210487 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210506 4983 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210594 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210624 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210647 4983 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210667 4983 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210687 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210712 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210732 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210751 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210793 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210817 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210836 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210859 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210881 4983 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210906 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" 
(UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210925 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210941 4983 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210969 4983 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210992 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.214162 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206409 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206426 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206451 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.202561 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206907 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206928 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206963 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207262 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207292 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207333 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.207917 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.217426 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.217449 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.217585 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.717548592 +0000 UTC m=+82.317647032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208354 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208408 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208425 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208908 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209251 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210047 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210080 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210561 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210734 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.211553 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.211573 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.211802 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.211969 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.212081 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.212400 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.212716 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.213058 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.213141 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.213360 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.214009 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.214046 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.214234 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.215231 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.215327 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.216289 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.218172 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.218663 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.220345 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.720317762 +0000 UTC m=+82.320416202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.222291 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.226309 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.226355 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.227311 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.228783 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.229054 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.230009 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.230069 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.230142 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.231033 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.233020 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.235520 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.240355 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.240378 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.240392 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.240446 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.740424836 +0000 UTC m=+82.340523276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.244495 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.245532 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.245836 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.247991 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.248566 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250163 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250202 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250218 4983 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250229 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250876 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.252026 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.252513 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.254102 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.256801 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.256979 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.257624 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.257984 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.258735 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.259613 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.259905 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.260308 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.260446 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.265786 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.266466 4983 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.270599 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.275494 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.285662 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.293667 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.300295 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311655 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncxpw\" (UniqueName: \"kubernetes.io/projected/607f8329-b349-45da-bb9b-785740b4ad4f-kube-api-access-ncxpw\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311687 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311704 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") pod \"ovnkube-node-wsfb4\" (UID: 
\"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311721 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-system-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311736 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trbxh\" (UniqueName: \"kubernetes.io/projected/43da17ff-aed1-44a2-a154-6800c3dd6ca9-kube-api-access-trbxh\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311775 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311806 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-os-release\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311827 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-cni-binary-copy\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311847 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-conf-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311854 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-system-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311872 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/48a48757-a3b8-4d4d-92ba-6a2459a26a86-mcd-auth-proxy-config\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311875 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" 
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311934 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6740e33-489f-4f45-b3e5-fdceaebf4301-hosts-file\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311905 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6740e33-489f-4f45-b3e5-fdceaebf4301-hosts-file\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311982 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311990 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312008 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312034 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/48a48757-a3b8-4d4d-92ba-6a2459a26a86-rootfs\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312043 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311988 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-os-release\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312065 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312148 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312102 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/48a48757-a3b8-4d4d-92ba-6a2459a26a86-rootfs\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312164 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312202 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312206 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-netns\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312228 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-netns\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312084 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-conf-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312254 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48a48757-a3b8-4d4d-92ba-6a2459a26a86-proxy-tls\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312286 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-kubelet\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312321 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: 
\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312371 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312400 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312427 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312457 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92wtj\" (UniqueName: \"kubernetes.io/projected/9138d88d-b777-4cab-b3d2-2099f01b205b-kube-api-access-92wtj\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312486 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-system-cni-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312512 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-binary-copy\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312543 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312569 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312596 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312625 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-multus-certs\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312641 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/48a48757-a3b8-4d4d-92ba-6a2459a26a86-mcd-auth-proxy-config\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312656 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312779 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312830 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312871 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l24hn\" (UniqueName: \"kubernetes.io/projected/6993dda4-ac10-47af-b406-d49d7781fbe5-kube-api-access-l24hn\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312913 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-multus\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312929 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w6zq\" (UniqueName: \"kubernetes.io/projected/48a48757-a3b8-4d4d-92ba-6a2459a26a86-kube-api-access-5w6zq\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc 
kubenswrapper[4983]: I0316 00:07:53.312971 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjz9m\" (UniqueName: \"kubernetes.io/projected/f81ec143-6c51-4f96-ae71-a4759bac7c70-kube-api-access-gjz9m\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312992 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312883 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-system-cni-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313029 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313050 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-kubelet\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312983 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313375 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313484 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-socket-dir-parent\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313522 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-daemon-config\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313612 4983 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-binary-copy\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313622 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/607f8329-b349-45da-bb9b-785740b4ad4f-serviceca\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313679 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313688 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-multus-certs\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313705 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-cnibin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313737 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-cnibin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313749 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-bin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313783 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313814 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-cni-binary-copy\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313997 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314033 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314058 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-etc-kubernetes\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314469 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314504 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314522 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314538 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314595 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-cnibin\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314611 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-hostroot\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314628 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314662 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-os-release\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314678 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-k8s-cni-cncf-io\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314698 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8sn\" (UniqueName: \"kubernetes.io/projected/b6740e33-489f-4f45-b3e5-fdceaebf4301-kube-api-access-6g8sn\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314715 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314937 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314969 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314996 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-bin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315024 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-cnibin\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315068 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-socket-dir-parent\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315048 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315316 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-etc-kubernetes\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315338 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315361 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-hostroot\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315423 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315469 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-os-release\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315498 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-k8s-cni-cncf-io\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315581 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315668 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315693 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-multus\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316066 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316178 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/607f8329-b349-45da-bb9b-785740b4ad4f-host\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316242 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316245 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-daemon-config\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316267 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/607f8329-b349-45da-bb9b-785740b4ad4f-host\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316313 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316320 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316350 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316628 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.316824 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.316896 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.816873802 +0000 UTC m=+82.416972282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316980 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317143 4983 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317149 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317161 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317175 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317185 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317194 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317203 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" 
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317212 4983 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317221 4983 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317230 4983 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317239 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317246 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317255 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317266 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317274 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317285 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317294 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317302 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317312 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317321 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317329 4983 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317337 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317346 4983 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317354 4983 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317362 4983 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317370 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317395 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317406 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317417 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317427 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317439 4983 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317453 4983 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317463 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317472 4983 
reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317480 4983 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317490 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317839 4983 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317851 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317861 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326396 4983 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326423 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326433 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326442 4983 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326454 4983 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326463 4983 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326471 4983 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326480 4983 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326488 4983 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326499 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326508 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326517 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326526 4983 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326535 4983 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326545 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326555 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326563 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326574 4983 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326582 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326591 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326800 
4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326966 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/607f8329-b349-45da-bb9b-785740b4ad4f-serviceca\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326921 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.330124 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.330605 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gjz9m\" (UniqueName: \"kubernetes.io/projected/f81ec143-6c51-4f96-ae71-a4759bac7c70-kube-api-access-gjz9m\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.331146 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8sn\" (UniqueName: \"kubernetes.io/projected/b6740e33-489f-4f45-b3e5-fdceaebf4301-kube-api-access-6g8sn\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.331385 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.333504 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wtj\" (UniqueName: \"kubernetes.io/projected/9138d88d-b777-4cab-b3d2-2099f01b205b-kube-api-access-92wtj\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.334364 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l24hn\" (UniqueName: \"kubernetes.io/projected/6993dda4-ac10-47af-b406-d49d7781fbe5-kube-api-access-l24hn\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.335913 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.336911 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.338112 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48a48757-a3b8-4d4d-92ba-6a2459a26a86-proxy-tls\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.339706 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.340120 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbxh\" (UniqueName: \"kubernetes.io/projected/43da17ff-aed1-44a2-a154-6800c3dd6ca9-kube-api-access-trbxh\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.340159 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w6zq\" (UniqueName: \"kubernetes.io/projected/48a48757-a3b8-4d4d-92ba-6a2459a26a86-kube-api-access-5w6zq\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.341134 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncxpw\" (UniqueName: \"kubernetes.io/projected/607f8329-b349-45da-bb9b-785740b4ad4f-kube-api-access-ncxpw\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.347055 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353655 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353696 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353708 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353727 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353742 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.385716 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.400171 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.416913 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.420071 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607f8329_b349_45da_bb9b_785740b4ad4f.slice/crio-588bdc17d12b0aa1813488ca2356b44515247409ae93e91e541d5edf68b51c3d WatchSource:0}: Error finding container 588bdc17d12b0aa1813488ca2356b44515247409ae93e91e541d5edf68b51c3d: Status 404 returned error can't find the container with id 588bdc17d12b0aa1813488ca2356b44515247409ae93e91e541d5edf68b51c3d Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.430939 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.440834 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.443662 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7507711e48881fcf269110901d0d8312cd087b99aada9b1d0b2b78795cb41a45"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.446083 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d2h5k" event={"ID":"607f8329-b349-45da-bb9b-785740b4ad4f","Type":"ContainerStarted","Data":"588bdc17d12b0aa1813488ca2356b44515247409ae93e91e541d5edf68b51c3d"} Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.446284 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9d72640b89205babd66f836657a980095e4e77693e562bdc42d6ebc494b1bc12 WatchSource:0}: Error finding container 9d72640b89205babd66f836657a980095e4e77693e562bdc42d6ebc494b1bc12: Status 404 returned error can't find the container with id 9d72640b89205babd66f836657a980095e4e77693e562bdc42d6ebc494b1bc12 Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.447065 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.449562 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"28f7cda51e847143b66e94c1076dac6245ded9a97d4156b4bc50bbecefb4643c"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.456027 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457161 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457209 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457227 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457241 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.465745 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81ec143_6c51_4f96_ae71_a4759bac7c70.slice/crio-6df1ad7a190117d0e59c8a25e7087c31c58bbe58f6ebf81b626cb51fe3232bc8 WatchSource:0}: Error finding container 6df1ad7a190117d0e59c8a25e7087c31c58bbe58f6ebf81b626cb51fe3232bc8: Status 404 returned error can't find the container with id 6df1ad7a190117d0e59c8a25e7087c31c58bbe58f6ebf81b626cb51fe3232bc8 Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.471138 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.497134 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6740e33_489f_4f45_b3e5_fdceaebf4301.slice/crio-0344d291246395ab2150363ce8c1b5a06aac354dbc364eac8e634905ec46c3da WatchSource:0}: Error finding container 0344d291246395ab2150363ce8c1b5a06aac354dbc364eac8e634905ec46c3da: Status 404 returned error can't find the container with id 0344d291246395ab2150363ce8c1b5a06aac354dbc364eac8e634905ec46c3da Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.504513 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.519785 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf055dad5_7c9b_46a1_a715_34847c30d0cf.slice/crio-a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6 WatchSource:0}: Error finding container a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6: Status 404 returned error can't find the container with id a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6 Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.529026 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43da17ff_aed1_44a2_a154_6800c3dd6ca9.slice/crio-401ef267c7d1bfbcdcf811a82c62c6947a23b08ade4be7715748e3305b896730 WatchSource:0}: Error finding container 401ef267c7d1bfbcdcf811a82c62c6947a23b08ade4be7715748e3305b896730: Status 404 returned error can't find the container with id 401ef267c7d1bfbcdcf811a82c62c6947a23b08ade4be7715748e3305b896730 Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.553882 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559169 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559196 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559205 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559357 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559373 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.580202 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9138d88d_b777_4cab_b3d2_2099f01b205b.slice/crio-9146b0a445a61236b7da18db6ee6e186c1e1013557072d9a5c5a27e2a534fa5d WatchSource:0}: Error finding container 9146b0a445a61236b7da18db6ee6e186c1e1013557072d9a5c5a27e2a534fa5d: Status 404 returned error can't find the container with id 9146b0a445a61236b7da18db6ee6e186c1e1013557072d9a5c5a27e2a534fa5d Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663344 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663384 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663398 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663407 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.729461 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.729564 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.729595 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.729624 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.729705 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.729743 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.729729598 +0000 UTC m=+83.329828038 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730064 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730085 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730116 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730142 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.73010726 +0000 UTC m=+83.330205710 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730178 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.730167742 +0000 UTC m=+83.330266302 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730201 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730362 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.730340408 +0000 UTC m=+83.330438828 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766701 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766734 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766744 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766776 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766787 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.830638 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.830676 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830784 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830828 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.830814095 +0000 UTC m=+83.430912525 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830886 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830948 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830966 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.831047 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.831025372 +0000 UTC m=+83.431123802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869205 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869254 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869278 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971899 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971965 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971973 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971987 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971995 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075073 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075134 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075151 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075163 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.098249 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.099248 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.100681 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.101588 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.102955 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.103713 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.104566 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.106044 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.106935 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.108199 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.109071 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.110527 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.111185 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.111918 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 16 00:07:54 
crc kubenswrapper[4983]: I0316 00:07:54.113113 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.113966 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.115269 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.115944 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.116691 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.118106 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.118727 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.120359 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.120964 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.121634 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.122286 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.123165 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.123981 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.124924 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.125484 4983 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.125966 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.126823 4983 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.126929 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.128560 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.129471 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.129918 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.131354 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.132389 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.132928 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.133966 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.134620 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.135467 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.136173 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.137196 4983 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.137932 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.138838 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.140631 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.141557 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.142298 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.143203 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.143679 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.144139 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.145137 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.145976 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.147038 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.176974 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.177017 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.177028 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.177045 4983 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.177060 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279466 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279523 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279538 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279561 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279578 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382275 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382288 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382304 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382316 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.455123 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" exitCode=0 Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.455188 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.455264 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.457328 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.459727 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.459772 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.459787 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"336a826201538546d8dcaa8c95a1b3d634f848ac4808afdaa7ffd62c732280c9"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.473229 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d2h5k" event={"ID":"607f8329-b349-45da-bb9b-785740b4ad4f","Type":"ContainerStarted","Data":"915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.475574 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.475668 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"6df1ad7a190117d0e59c8a25e7087c31c58bbe58f6ebf81b626cb51fe3232bc8"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.476927 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.478207 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.478267 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.478278 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9d72640b89205babd66f836657a980095e4e77693e562bdc42d6ebc494b1bc12"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.479801 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" 
containerID="ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495" exitCode=0 Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.479890 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.479949 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerStarted","Data":"9146b0a445a61236b7da18db6ee6e186c1e1013557072d9a5c5a27e2a534fa5d"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.482354 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" event={"ID":"43da17ff-aed1-44a2-a154-6800c3dd6ca9","Type":"ContainerStarted","Data":"78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.482386 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" event={"ID":"43da17ff-aed1-44a2-a154-6800c3dd6ca9","Type":"ContainerStarted","Data":"e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.482396 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" event={"ID":"43da17ff-aed1-44a2-a154-6800c3dd6ca9","Type":"ContainerStarted","Data":"401ef267c7d1bfbcdcf811a82c62c6947a23b08ade4be7715748e3305b896730"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484692 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v748m" event={"ID":"b6740e33-489f-4f45-b3e5-fdceaebf4301","Type":"ContainerStarted","Data":"23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484742 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v748m" event={"ID":"b6740e33-489f-4f45-b3e5-fdceaebf4301","Type":"ContainerStarted","Data":"0344d291246395ab2150363ce8c1b5a06aac354dbc364eac8e634905ec46c3da"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484830 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484878 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484889 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484905 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484917 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.494765 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.512268 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.526219 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.538031 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.553247 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.563644 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.584062 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591101 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591155 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591164 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.604826 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.616058 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.629460 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.646356 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.657132 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.668041 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.679948 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693875 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693891 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693901 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.694784 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.707012 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.724862 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.736001 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.740226 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.740600 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.740635 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.740675 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.740798 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.740845 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.740832129 +0000 UTC m=+85.340930559 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741176 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741189 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741199 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741227 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.741219892 +0000 UTC m=+85.341318322 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741266 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741294 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.741280964 +0000 UTC m=+85.341379394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741338 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.741331865 +0000 UTC m=+85.341430295 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.749690 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.761077 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.770459 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.781254 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.791104 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795888 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795945 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795962 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795972 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.802567 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.821012 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.832067 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.841892 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.841960 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842125 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842145 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842191 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842261 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.842223636 +0000 UTC m=+85.442322066 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842255 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842350 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.84232693 +0000 UTC m=+85.442425410 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.847939 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.897943 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.898258 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.898266 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.898283 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.898323 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000898 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000941 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000966 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000977 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.091634 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:55 crc kubenswrapper[4983]: E0316 00:07:55.091741 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.092079 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:55 crc kubenswrapper[4983]: E0316 00:07:55.092152 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.092206 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:55 crc kubenswrapper[4983]: E0316 00:07:55.092259 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.092309 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:55 crc kubenswrapper[4983]: E0316 00:07:55.092366 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.103559 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.103612 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.103621 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.104167 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.104191 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208517 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208551 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208558 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208591 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208602 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.311525 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.311739 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.311873 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.312017 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.312126 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415233 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415503 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415512 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415527 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415538 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.490187 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerStarted","Data":"a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492845 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492906 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492927 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492946 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492963 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.501662 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.512393 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519087 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519150 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519165 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519185 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519201 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.526118 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.537948 4983 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 
2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.553136 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.568377 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.587309 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.597215 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.608672 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.619184 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620869 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620906 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620917 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620931 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620940 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.632730 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.643106 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.657375 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.671454 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723031 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723066 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723074 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723089 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723099 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825331 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825388 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825431 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825444 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.927951 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.928015 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.928030 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.928054 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.928069 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031607 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031663 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031678 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031697 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031711 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135148 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135201 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135216 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135226 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239342 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239365 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239406 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239428 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342619 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342690 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342729 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342810 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342835 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446420 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446478 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446495 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446537 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446554 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.507918 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.510078 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.512407 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c" exitCode=0 Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.512442 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.529227 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" 
for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.542068 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550013 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550057 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550076 4983 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550104 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550124 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.559117 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.573623 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.588157 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.608121 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.632443 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.649456 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652617 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652660 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652679 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652702 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652718 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.665845 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.680354 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.696294 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.710258 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.724797 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.741535 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.753723 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755850 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755877 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755885 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755911 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.763472 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.763603 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.763631 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763704 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.763673501 +0000 UTC m=+89.363771971 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763717 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763836 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.763800185 +0000 UTC m=+89.363898615 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.763862 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763887 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763924 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763931 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763944 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763956 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.76394907 +0000 UTC m=+89.364047500 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.764005 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.763988051 +0000 UTC m=+89.364086511 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.770448 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.784202 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.798496 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.815055 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.826498 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.842378 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.852980 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858432 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858482 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858499 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858511 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.864854 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.864983 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865103 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865127 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865142 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865194 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.865172651 +0000 UTC m=+89.465271101 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865470 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865503 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.865492972 +0000 UTC m=+89.465591412 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.865689 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.876334 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.885914 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.895366 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.908620 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.929686 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961260 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961300 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961311 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961326 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961334 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064282 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064325 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064337 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064354 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064367 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.091925 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.091976 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.091985 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.092051 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:57 crc kubenswrapper[4983]: E0316 00:07:57.092090 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:07:57 crc kubenswrapper[4983]: E0316 00:07:57.092188 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:07:57 crc kubenswrapper[4983]: E0316 00:07:57.092285 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:07:57 crc kubenswrapper[4983]: E0316 00:07:57.092370 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167574 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167614 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167628 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167645 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167690 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.273965 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.274035 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.274060 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.274082 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.274095 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377571 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377589 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377610 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377624 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479393 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479448 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479465 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479490 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479507 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.517402 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87" exitCode=0 Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.517446 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.530172 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.549876 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.559972 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.569216 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.580027 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582438 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582462 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582470 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582483 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582492 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.591636 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.615657 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.627098 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.638782 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.650494 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 
00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.664611 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.677948 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.690956 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.691019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.691038 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.691063 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.691082 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.695098 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.706281 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.793733 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.794225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.794240 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.794259 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.794271 4983 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897509 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897545 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897553 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897567 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897579 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000100 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000142 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000150 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000164 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000173 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.102990 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.103032 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.103044 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.103059 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.103071 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206533 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206643 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206666 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206683 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.309914 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.309974 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.309991 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.310016 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.310033 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413064 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413163 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413188 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413266 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413290 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516815 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516874 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516890 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516914 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516932 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.527748 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.531990 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662" exitCode=0 Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.532048 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.553238 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.575026 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.592091 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.606020 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619136 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619217 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619165 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619229 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619473 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619500 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.629781 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.657449 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.673363 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 
2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.691490 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.705109 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.720601 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721572 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721617 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721629 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721647 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721659 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.732291 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.750666 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.767384 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824087 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824125 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824134 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824150 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824159 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927149 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927159 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029477 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029517 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029531 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029548 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029561 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.091856 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.091878 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.091971 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.091982 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:59 crc kubenswrapper[4983]: E0316 00:07:59.092132 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:07:59 crc kubenswrapper[4983]: E0316 00:07:59.092711 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:07:59 crc kubenswrapper[4983]: E0316 00:07:59.092849 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:07:59 crc kubenswrapper[4983]: E0316 00:07:59.092981 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132346 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132410 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132429 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132455 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132472 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235955 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235973 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235984 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338669 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338687 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338715 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338734 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.441935 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.442034 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.442052 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.442079 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.442096 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.542020 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerStarted","Data":"6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.551713 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.552387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.552730 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.553551 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.553946 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.562559 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.579355 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.599446 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.616231 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.627720 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.639119 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.658970 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.659055 4983 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.659071 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.659099 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.659115 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.660174 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.673841 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.686373 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.698488 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.711557 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.723809 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.736930 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.752408 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761673 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761727 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761738 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761772 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761794 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864619 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864664 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864677 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864695 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864709 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.966947 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.966977 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.966985 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.966999 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.967008 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069021 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069053 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069061 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069075 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069084 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172072 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172152 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172181 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172203 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.273932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.273964 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.273973 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.273985 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.274003 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.377186 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.377288 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.377836 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.377918 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.378199 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481300 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481320 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481343 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481360 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.554237 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb" exitCode=0 Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.554338 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.561954 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.562918 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.562957 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.563081 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.572237 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588041 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588085 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588097 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588115 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588127 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
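
Interleaved with the webhook failures, the kubelet republishes the Ready=False node condition on every sync because the container runtime still reports NetworkReady=false: ovnkube-node has not yet written a network config into /etc/kubernetes/cni/net.d/. The sketch below is an approximation (an assumption about the mechanism, not CRI-O's actual implementation) of the directory scan behind "no CNI configuration file in /etc/kubernetes/cni/net.d/": readiness can only flip once at least one CNI config file exists in the runtime's conf dir.

    // cnicheck.go: rough approximation of the readiness probe behind
    // "no CNI configuration file in /etc/kubernetes/cni/net.d/".
    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    // hasCNIConfig reports whether confDir contains at least one file
    // with an extension the libcni config loaders accept.
    func hasCNIConfig(confDir string) (bool, error) {
    	entries, err := os.ReadDir(confDir)
    	if err != nil {
    		return false, err
    	}
    	for _, e := range entries {
    		if e.IsDir() {
    			continue
    		}
    		switch filepath.Ext(e.Name()) {
    		case ".conf", ".conflist", ".json":
    			return true, nil
    		}
    	}
    	return false, nil
    }

    func main() {
    	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
    	if err != nil || !ok {
    		// The state the node is stuck in above: NetworkReady=false,
    		// so the kubelet keeps publishing Ready=False for the node.
    		fmt.Println("NetworkReady=false: no CNI configuration file found", err)
    		return
    	}
    	fmt.Println("NetworkReady=true")
    }

This is consistent with the pod statuses in the log: the ovnkube-node-wsfb4 containers that would write that config are still in PodInitializing, so the condition should clear on the first sync after ovnkube-controller drops its config file.
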
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.589946 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.596548 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.598349 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.602201 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.614280 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.626653 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.639086 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.657187 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.669151 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.687121 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695631 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695668 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695682 4983 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695701 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695717 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.700294 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.712972 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.727252 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.736251 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.748043 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.758375 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.769414 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.781115 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.797234 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802602 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802635 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802645 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802660 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802671 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.808014 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.808172 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808186 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.808168087 +0000 UTC m=+97.408266517 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.808211 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.808251 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808378 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808419 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808434 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808451 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808380 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808487 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.808467527 +0000 UTC m=+97.408566037 (durationBeforeRetry 8s). 
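The UnmountVolume.TearDown failure is unrelated to the webhook: the kubevirt.io.hostpath-provisioner CSI driver has simply not re-registered with the restarted kubelet yet, so TearDownAt cannot obtain a CSI client and the operation is retried 8s later (durationBeforeRetry 8s). Drivers register by placing a socket in kubelet's plugin-registration directory; a sketch of inspecting it follows, where the path is the upstream kubelet default and the expected socket name is an assumption.

    # csi_registration_check.py (illustrative name) -- list the plugin sockets
    # kubelet's plugin watcher sees. An entry for kubevirt.io.hostpath-provisioner
    # should appear once the driver pod is running again.
    import os

    REG_DIR = "/var/lib/kubelet/plugins_registry"  # upstream kubelet default path

    try:
        entries = sorted(os.listdir(REG_DIR))
    except FileNotFoundError:
        entries = []

    print(f"{len(entries)} registered plugin socket(s) in {REG_DIR}")
    for name in entries:
        print(" ", name)  # e.g. kubevirt.io.hostpath-provisioner-reg.sock (assumed name)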
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808533 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.808500448 +0000 UTC m=+97.408599018 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808551 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.808545989 +0000 UTC m=+97.408644419 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.813015 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 
00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.826579 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.837145 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.845635 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.854825 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.865653 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.876291 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.884804 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.892084 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.904969 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.904999 4983 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.905008 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.905022 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.905031 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.907268 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2b
ce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.909625 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.909653 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909743 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909790 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909821 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909835 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909801 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.909788492 +0000 UTC m=+97.509886922 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909909 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.909889195 +0000 UTC m=+97.509987675 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007911 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007947 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007956 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007971 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007980 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.092337 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.092427 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:01 crc kubenswrapper[4983]: E0316 00:08:01.092485 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.092564 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:01 crc kubenswrapper[4983]: E0316 00:08:01.092723 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.092789 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:01 crc kubenswrapper[4983]: E0316 00:08:01.092854 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:01 crc kubenswrapper[4983]: E0316 00:08:01.092911 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111716 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111810 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111838 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111866 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111889 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213696 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213741 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213768 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213784 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213797 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316017 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316083 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316113 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316129 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418851 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418903 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418915 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418931 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418946 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522135 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522182 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522208 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522224 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.568681 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537" exitCode=0 Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.568718 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.588748 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.606992 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624038 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624051 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624066 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624079 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.628556 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.646623 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.667220 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.680108 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.693417 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.705636 4983 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.722237 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727401 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727454 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727470 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727482 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.731785 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.743018 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.756363 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.766896 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.786028 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829161 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829174 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc 
kubenswrapper[4983]: I0316 00:08:01.829207 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829220 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.931785 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.931843 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.931854 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.931871 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.932184 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033803 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033871 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033885 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033893 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078583 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078626 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078637 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078653 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078665 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.091951 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095161 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095203 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095219 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095230 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.102069 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.102618 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.107008 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109844 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109881 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109893 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109909 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109920 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
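Every patch attempt above dies on the same client-side check: Go's TLS stack compares the node clock against the webhook certificate's validity window and refuses the handshake, because 2026-03-16T00:08:02Z falls after the certificate's NotAfter of 2025-08-24T17:21:41Z. A minimal sketch of that failure mode, assuming nothing from this cluster beyond the error text (the key and certificate below are generated locally and are purely hypothetical): it self-signs a certificate whose window has already closed, then verifies it with crypto/x509, which produces the same "certificate has expired or is not yet valid" error the kubelet is logging.

package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"time"
)

func main() {
	// Hypothetical local key pair; stands in for the webhook's serving key.
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		panic(err)
	}

	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(1),
		Subject:      pkix.Name{CommonName: "expired-webhook-example"},
		// A validity window that already closed, mirroring a serving cert
		// that expired 2025-08-24 on a node whose clock reads 2026-03-16.
		NotBefore:             time.Now().AddDate(-1, 0, 0),
		NotAfter:              time.Now().AddDate(0, 0, -200),
		IsCA:                  true,
		BasicConstraintsValid: true,
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		panic(err)
	}
	cert, err := x509.ParseCertificate(der)
	if err != nil {
		panic(err)
	}

	roots := x509.NewCertPool()
	roots.AddCert(cert)

	// Verify applies the same NotBefore/NotAfter check a TLS handshake does;
	// an expired certificate yields x509.CertificateInvalidError (Expired).
	if _, err := cert.Verify(x509.VerifyOptions{Roots: roots}); err != nil {
		// Prints: x509: certificate has expired or is not yet valid: ...
		fmt.Println(err)
	}
}

The same check runs inside every TLS handshake, which is why each retry in the journal fails identically until the serving certificate is rotated or the node clock is corrected.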
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.112994 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.120139 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122905 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122935 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122946 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122961 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122975 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
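Because every retry reports the identical NotAfter (2025-08-24T17:21:41Z), a reasonable next triage step is to inspect the certificate the webhook is actually serving on the endpoint named in these records. A small sketch of that check, assuming the listener at 127.0.0.1:9743 from the messages above is still up; it skips chain verification deliberately, since verification is exactly what fails here, and only reports the served validity window.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Address taken from the journal above; adjust when triaging elsewhere.
	const addr = "127.0.0.1:9743"

	// InsecureSkipVerify is intentional: the goal is to read the certificate
	// the server presents even though normal verification would reject it.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	// Print the validity window of each certificate in the served chain.
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s\n  NotBefore=%s\n  NotAfter=%s\n  expired=%t\n",
			cert.Subject,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			time.Now().After(cert.NotAfter))
	}
}

If the reported NotAfter matches the log, the fault is the stale serving certificate rather than the network path, which is consistent with the node otherwise reporting sufficient memory, disk, and PIDs in the surrounding records.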
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.123226 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.135995 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.140085 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143463 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143495 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143523 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143538 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.153874 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.155335 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.155480 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156663 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156711 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156722 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156746 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.167968 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.192654 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.201199 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.211171 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.224803 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.236246 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.247225 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.256176 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.258935 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.258961 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.258987 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.259004 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.259015 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.272092 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2b
ce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361530 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361564 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361574 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361586 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361594 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464404 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464449 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464461 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464485 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567800 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567812 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567858 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567867 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.577488 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerStarted","Data":"71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.591257 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.602079 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.612059 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 
00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.628771 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.640044 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.651700 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.662031 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671487 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671811 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671823 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671840 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671851 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.682029 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.692727 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.700660 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.709258 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.721115 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.734425 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.754022 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.773965 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.774027 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.774049 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc 
kubenswrapper[4983]: I0316 00:08:02.774078 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.774102 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876794 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876843 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876855 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876873 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876887 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.979404 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.979705 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.979897 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.980080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.980248 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.082644 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.082908 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.082997 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.083120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.083199 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.092077 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.092145 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.092295 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:03 crc kubenswrapper[4983]: E0316 00:08:03.092355 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.092330 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:03 crc kubenswrapper[4983]: E0316 00:08:03.092477 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:03 crc kubenswrapper[4983]: E0316 00:08:03.092536 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:03 crc kubenswrapper[4983]: E0316 00:08:03.092582 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186461 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186497 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186505 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186522 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186533 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288434 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288492 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288510 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288535 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288552 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391091 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391431 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391577 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391719 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391944 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495252 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495284 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495294 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495311 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495322 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.581912 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/0.log" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.584459 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4" exitCode=1 Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.584491 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.585090 4983 scope.go:117] "RemoveContainer" containerID="ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599188 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599221 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599244 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.601545 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.614977 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.629223 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.647437 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.661133 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.682219 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.695114 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702419 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702502 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702517 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.707998 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.719745 4983 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 
2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.735440 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.750073 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.761149 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.779020 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/o
vn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721307 6856 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:02.721365 6856 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:02.721399 6856 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721894 6856 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:02.721941 6856 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:02.721950 6856 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:02.721975 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:02.721986 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:02.721991 6856 factory.go:656] 
Stopping watch factory\\\\nI0316 00:08:02.721998 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:02.722007 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:02.722026 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.790898 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806094 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806265 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806277 4983 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806292 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806305 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909504 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909541 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909552 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909568 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909579 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012294 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012380 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012389 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012403 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012411 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115373 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115401 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115409 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115422 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115430 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218606 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218633 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218640 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218652 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218661 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320585 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320617 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320625 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320638 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320647 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423056 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423095 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423107 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423130 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525809 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525850 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525859 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525876 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525887 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.589266 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/0.log" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.591330 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.591893 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.627963 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.628003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.628016 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.628032 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.628045 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.729967 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.729991 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.729999 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.730013 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.730021 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832051 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832083 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832092 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832107 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832117 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935061 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935107 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935129 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935156 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935176 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.967784 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.983693 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.005985 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.021730 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.034603 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037509 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037767 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037796 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037832 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.049378 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.067476 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f0
79b282e8293ee0d848a43ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721307 6856 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:02.721365 6856 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:02.721399 6856 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721894 6856 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:02.721941 6856 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:02.721950 6856 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:02.721975 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:02.721986 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:02.721991 6856 factory.go:656] Stopping watch factory\\\\nI0316 00:08:02.721998 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:02.722007 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:02.722026 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.081218 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.091924 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.092031 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.092096 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.092031 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.092187 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.092252 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.092566 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.092675 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.093062 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.103182 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.103720 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.104978 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.108447 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.119090 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140128 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140313 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140427 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140546 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140646 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.141486 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.157065 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.168344 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.183910 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242865 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242908 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242918 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242934 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242945 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.344924 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.344966 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.344978 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.344992 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.345002 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448410 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448457 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448490 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551062 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551101 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551142 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551157 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551166 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.596808 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/1.log" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.597534 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/0.log" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.600621 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc" exitCode=1 Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.601310 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.601527 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.602266 4983 scope.go:117] "RemoveContainer" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.602404 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.602451 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.602485 4983 scope.go:117] "RemoveContainer" containerID="ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.617111 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.643163 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653719 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653788 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653805 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653827 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653843 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.657345 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.671746 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.689102 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.700593 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.710278 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.719349 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.730073 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.750118 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721307 6856 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:02.721365 6856 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:02.721399 6856 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721894 6856 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:02.721941 6856 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:02.721950 6856 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:02.721975 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:02.721986 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:02.721991 6856 factory.go:656] Stopping watch factory\\\\nI0316 00:08:02.721998 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:02.722007 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:02.722026 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756441 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756472 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756485 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756499 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756510 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.763851 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.777726 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.791070 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.804371 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.815678 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.831452 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858845 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858874 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858885 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858912 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.960957 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.960996 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.961007 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.961028 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.961042 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063043 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063117 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063136 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063171 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063205 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165501 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165543 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165554 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165569 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165580 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268046 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268056 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268081 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370564 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370589 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370620 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370644 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472894 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472934 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472942 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472955 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472963 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575579 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575642 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575666 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575695 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575717 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.605537 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/1.log" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.609673 4983 scope.go:117] "RemoveContainer" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc" Mar 16 00:08:06 crc kubenswrapper[4983]: E0316 00:08:06.610004 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.622842 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.634097 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.647178 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.659898 4983 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.674265 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678468 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678517 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678530 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678548 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678560 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.685835 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.705676 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.718031 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.731211 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: 
I0316 00:08:06.743279 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.753885 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.763186 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.777854 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781484 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781525 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781537 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781580 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781593 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.790427 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01
b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.804271 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.819559 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883741 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883829 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883845 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883856 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.986374 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.986797 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.987471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.987796 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.988021 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090490 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090538 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090547 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090566 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090577 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.091652 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.091649 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.091796 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.091946 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:07 crc kubenswrapper[4983]: E0316 00:08:07.091938 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:07 crc kubenswrapper[4983]: E0316 00:08:07.092176 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:07 crc kubenswrapper[4983]: E0316 00:08:07.092239 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:07 crc kubenswrapper[4983]: E0316 00:08:07.092348 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.193902 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.193955 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.193969 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.193991 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.194007 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297209 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297249 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297260 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297278 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297291 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399437 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399478 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399489 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399517 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.501957 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.501993 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.502003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.502019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.502031 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.605956 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.606387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.606604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.606792 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.606945 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710521 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710578 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710596 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710641 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812817 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812853 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812862 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812883 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812899 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.915861 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.915952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.915984 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.915998 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.916007 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018377 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018413 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018420 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018434 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018448 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120356 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120396 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120407 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120421 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120431 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222377 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222428 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222443 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222466 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222477 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324740 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324839 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324857 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324882 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324900 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427012 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427092 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427129 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427143 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529837 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529884 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529896 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529916 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529930 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632102 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632139 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632148 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632163 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632176 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.687175 4983 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.734327 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.735057 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.735201 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.735292 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.735381 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.837609 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.838009 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.838198 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.838408 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.838540 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.890323 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.890407 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.890441 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.890471 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890581 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890609 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890629 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.890605876 +0000 UTC m=+113.490704306 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890643 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890669 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890674 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.890666178 +0000 UTC m=+113.490764608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890742 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.89072 +0000 UTC m=+113.490818460 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890744 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.891015 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.890919856 +0000 UTC m=+113.491018346 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941181 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941219 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941229 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941253 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.992356 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.992457 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992629 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992705 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992729 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992653 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992838 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.99281224 +0000 UTC m=+113.592910710 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992868 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.992855331 +0000 UTC m=+113.592953791 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044416 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044454 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044462 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044478 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044488 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.092107 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.092142 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.092107 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.092124 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:09 crc kubenswrapper[4983]: E0316 00:08:09.092236 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:09 crc kubenswrapper[4983]: E0316 00:08:09.092367 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:09 crc kubenswrapper[4983]: E0316 00:08:09.092440 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:09 crc kubenswrapper[4983]: E0316 00:08:09.092480 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146652 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146693 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146703 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146721 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146730 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249863 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249911 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249927 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249939 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.353876 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.353932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.353954 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.353983 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.354007 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.457853 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.458561 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.458691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.458852 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.459269 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563291 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563359 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563377 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563403 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563422 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666516 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666582 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666632 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666653 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.769923 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.769999 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.770025 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.770052 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.770069 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873561 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873639 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873660 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873713 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986061 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986170 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986196 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089681 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089797 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089815 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089837 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089854 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192521 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192694 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192722 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192791 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192826 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.295924 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.296233 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.296322 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.296433 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.296536 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.399954 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.400307 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.400467 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.400622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.400802 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.504937 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.505078 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.505102 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.505132 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.505150 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608146 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608210 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608229 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608253 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608271 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711499 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711566 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711584 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711606 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711623 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.813993 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.814945 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.815000 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.815031 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.815052 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918200 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918267 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918291 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918319 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918340 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021365 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021427 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021445 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021470 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021486 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.091932 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.091971 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.092013 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:11 crc kubenswrapper[4983]: E0316 00:08:11.092089 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.092126 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:11 crc kubenswrapper[4983]: E0316 00:08:11.092249 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:11 crc kubenswrapper[4983]: E0316 00:08:11.092397 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:11 crc kubenswrapper[4983]: E0316 00:08:11.092516 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
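The "Node became not ready" entries above record a v1.NodeCondition on the node object. As a rough sketch (not the kubelet's actual source, which lives in pkg/kubelet/nodestatus; notReadyCondition is a hypothetical helper), the condition={...} payload in those entries can be reconstructed with the standard Kubernetes API types:

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

// notReadyCondition is a hypothetical helper that rebuilds the condition
// seen in the setters.go:603 entries above: Ready=False with reason
// KubeletNotReady while /etc/kubernetes/cni/net.d/ holds no CNI config.
func notReadyCondition(now time.Time) corev1.NodeCondition {
	return corev1.NodeCondition{
		Type:               corev1.NodeReady,
		Status:             corev1.ConditionFalse,
		LastHeartbeatTime:  metav1.NewTime(now),
		LastTransitionTime: metav1.NewTime(now),
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
			"Has your network provider started?",
	}
}

func main() {
	// Marshalling yields the same JSON shape as the condition={...}
	// field in the log entries above.
	b, _ := json.Marshal(notReadyCondition(time.Now()))
	fmt.Println(string(b))
}
```

Successive heartbeats differ only in lastHeartbeatTime/lastTransitionTime, which is why the repeated entries are identical apart from timestamps.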
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123291 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123363 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123384 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123412 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123433 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225397 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225485 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327220 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327256 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327284 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327295 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429129 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429175 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429188 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429207 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429220 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.531875 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.531948 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.531970 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.532001 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.532022 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633849 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633890 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633901 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633914 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633924 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737029 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737069 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737100 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737112 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839547 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839609 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839630 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839655 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839673 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942488 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942551 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942569 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942595 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942613 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045295 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045358 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045397 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045411 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
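The status_manager.go:875 entries that follow all fail the same way: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired 2025-08-24T17:21:41Z, while the node clock reads 2026-03-16. A minimal sketch of the validity check that TLS verification performs, using only Go's standard library (the PEM file name is a placeholder, not a path taken from this log):

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Placeholder path: point this at the webhook's serving certificate.
	data, err := os.ReadFile("webhook-serving-cert.pem")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("NotBefore=%s NotAfter=%s\n", cert.NotBefore, cert.NotAfter)
	if now := time.Now(); now.After(cert.NotAfter) {
		// Same failure mode the kubelet reports below:
		// "x509: certificate has expired or is not yet valid".
		fmt.Printf("expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339),
			cert.NotAfter.UTC().Format(time.RFC3339))
	}
}
```

Until that certificate is rotated (or the clock corrected), any client dialing the webhook would hit the same x509 error, so every pod-status patch below is rejected regardless of its content.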
Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.111116 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z"
Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.131485 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.148952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.149011 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.149028 4983 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.149082 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.149100 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.155362 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-conf
ig\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.182044 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.203602 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.222638 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.244840 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251579 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251644 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251663 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251710 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253347 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253522 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253628 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253742 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253863 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.263202 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.272481 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277355 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277415 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277431 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277442 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.279399 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16
T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.294803 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.298736 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304240 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304317 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304335 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304359 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304376 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.312869 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.323312 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.326548 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328331 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328348 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328388 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.341180 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.346175 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.350904 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.350996 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.351016 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.351042 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.351098 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.356728 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.371567 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.372381 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375021 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375091 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375138 4983 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375515 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375913 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.376023 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.405832 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f0
79b282e8293ee0d848a43ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.480824 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.480932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.480952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.481014 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.481033 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583679 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583857 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583915 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583994 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.092275 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.092338 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.092339 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.092356 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:13 crc kubenswrapper[4983]: E0316 00:08:13.092493 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:13 crc kubenswrapper[4983]: E0316 00:08:13.092709 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:13 crc kubenswrapper[4983]: E0316 00:08:13.092881 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:13 crc kubenswrapper[4983]: E0316 00:08:13.093162 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.693423 4983 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.092385 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.092430 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.092468 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:15 crc kubenswrapper[4983]: E0316 00:08:15.092569 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.092585 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:15 crc kubenswrapper[4983]: E0316 00:08:15.092667 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:15 crc kubenswrapper[4983]: E0316 00:08:15.092738 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:15 crc kubenswrapper[4983]: E0316 00:08:15.092804 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.092463 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.092519 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.092574 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.092691 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:17 crc kubenswrapper[4983]: E0316 00:08:17.092923 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:17 crc kubenswrapper[4983]: E0316 00:08:17.093508 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:17 crc kubenswrapper[4983]: E0316 00:08:17.093702 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:17 crc kubenswrapper[4983]: E0316 00:08:17.093958 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.094110 4983 scope.go:117] "RemoveContainer" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc"
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635528 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635539 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635554 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635567 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.649668 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/1.log" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.652181 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a"} Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.652718 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.663613 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.678495 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.696541 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.708313 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.717142 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.728478 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.737980 4983 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.738019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.738030 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.738043 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.738053 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.739097 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.751475 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.760638 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.777603 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.787570 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.799350 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.814602 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.824773 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840251 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840307 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840324 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840342 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840354 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.842125 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.857617 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943567 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc 
kubenswrapper[4983]: I0316 00:08:17.943582 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943600 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943612 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.046923 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.046985 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.047003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.047028 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.047047 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.093445 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.110375 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150059 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150069 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150085 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150102 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252469 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252480 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355057 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355089 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355100 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355126 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.458851 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.458928 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.458951 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.458983 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.459005 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562252 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562318 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562340 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562356 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.658590 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.661603 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.662261 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664042 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/2.log" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664352 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664426 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664483 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664509 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.665043 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/1.log" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.668308 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" exitCode=1 Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.668380 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.668448 4983 scope.go:117] "RemoveContainer" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.669299 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:18 crc kubenswrapper[4983]: E0316 00:08:18.669537 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.701817 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5
a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.716301 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.735326 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.747560 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.758852 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768578 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768648 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768662 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768678 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768691 4983 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.773659 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.789320 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.818195 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90399
2a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.833508 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.850327 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.866236 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871172 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871210 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871222 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871240 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871251 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.880677 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.897937 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.913377 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.930370 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.948915 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.964965 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973688 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973706 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973723 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973736 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.979539 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.994878 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.024903 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.043312 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.061712 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076614 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076676 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076690 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076715 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076731 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.077395 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.092393 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.092436 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.092523 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.092551 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.092581 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.092728 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.092864 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.092940 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.097224 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.111928 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.128415 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.143374 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.155819 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.170971 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179407 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179451 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179460 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179487 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.185186 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.202075 4983 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 
2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.227480 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.246030 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.273104 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f
12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event 
handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90
ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283851 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283876 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283884 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283898 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283907 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387086 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387144 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387160 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387184 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387201 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490148 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490184 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490205 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490214 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.592864 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.593121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.593192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.593270 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.593334 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.674179 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/2.log" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.679858 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.680115 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.694291 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\
\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696379 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696540 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696605 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696626 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.715179 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.732049 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.743696 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.757741 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.788702 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799198 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799219 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.808522 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.821050 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.832987 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.842717 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.861496 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.873105 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.885402 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: 
I0316 00:08:19.901020 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902313 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902343 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902366 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902386 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902398 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.920506 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.930870 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.945732 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\
"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004072 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004101 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004109 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004123 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004133 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.106572 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.107270 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.107379 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.107485 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.107573 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209738 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209856 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209882 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209913 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209936 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.312937 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.313302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.313468 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.313615 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.313809 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416577 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416637 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416655 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416680 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416698 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.520220 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.520637 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.520896 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.521108 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.521266 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624421 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624487 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624533 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624550 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727699 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727744 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727781 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727800 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727812 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830873 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830906 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830940 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830958 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830970 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933570 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933641 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933658 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933681 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933723 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.036978 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.037036 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.037046 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.037062 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.037071 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.091735 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.091798 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.091941 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:21 crc kubenswrapper[4983]: E0316 00:08:21.092070 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.092129 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:21 crc kubenswrapper[4983]: E0316 00:08:21.092286 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:21 crc kubenswrapper[4983]: E0316 00:08:21.092504 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:21 crc kubenswrapper[4983]: E0316 00:08:21.092604 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.139805 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.140518 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.140636 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.140747 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.140892 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.244598 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.245011 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.245169 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.245319 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.245463 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348126 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348178 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348194 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348218 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348235 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.451326 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.451736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.451989 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.452306 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.452506 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.556067 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.556508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.556744 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.556980 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.557117 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661155 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661171 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661216 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765747 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765852 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765868 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765889 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765905 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868265 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868273 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868286 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868294 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.971673 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.971745 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.971773 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.971792 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.972164 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074134 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074162 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074170 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074182 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074191 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.105154 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.121190 4983 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.136936 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.162301 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5
a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176613 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176653 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176667 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176684 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176697 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.178513 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.192697 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.203282 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.213061 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.223387 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.233973 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.250417 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.260236 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.270998 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: 
I0316 00:08:22.278278 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278323 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278338 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278360 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278376 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.281857 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.293171 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.302900 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.315808 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381089 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381151 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc 
kubenswrapper[4983]: I0316 00:08:22.381167 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381189 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381206 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.754319 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.754743 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.755032 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.755302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.755538 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.770448 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776131 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776216 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776244 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776266 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.793110 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797352 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797416 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797467 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797489 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.810364 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815536 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815551 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815577 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815592 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.828568 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833202 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833256 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833280 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833300 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.847636 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.847888 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850053 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850103 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850156 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952324 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952422 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952531 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.056922 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.057402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.057445 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.057482 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.057508 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.091689 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.091714 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.091811 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:23 crc kubenswrapper[4983]: E0316 00:08:23.092192 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:23 crc kubenswrapper[4983]: E0316 00:08:23.092319 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.091841 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:23 crc kubenswrapper[4983]: E0316 00:08:23.092534 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:23 crc kubenswrapper[4983]: E0316 00:08:23.092578 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.161737 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.162140 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.162319 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.162514 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.163444 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266354 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266395 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266407 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266423 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266434 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369714 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369819 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369839 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369865 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369883 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472321 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472567 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472635 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472699 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472785 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575835 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575887 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575904 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575940 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575954 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678285 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678305 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781375 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781463 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781493 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781516 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885115 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885172 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885189 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885213 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885231 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988531 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988592 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988609 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988632 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988648 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091541 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091609 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091633 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091656 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091675 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.194659 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.195135 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.195157 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.195185 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.195202 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298046 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298114 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298156 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298174 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.400903 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.400976 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.401002 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.401035 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.401057 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505164 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505209 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505223 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505244 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505259 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608693 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608785 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608803 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608832 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608849 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.711948 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.711989 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.711997 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.712013 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.712022 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814532 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814689 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814712 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814734 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814751 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917598 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917676 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917694 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917723 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917745 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.970194 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.970309 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.970390 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.970425 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970517 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:56.970489238 +0000 UTC m=+145.570587678 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970590 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970616 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970676 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970697 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970599 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970698 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:56.970672514 +0000 UTC m=+145.570770984 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970830 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:56.970805378 +0000 UTC m=+145.570903888 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970856 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:56.970842489 +0000 UTC m=+145.570941049 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020719 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020810 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020831 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020905 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020931 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.071603 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.071683 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.071852 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.071902 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.071920 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.071927 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.072026 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:57.072002039 +0000 UTC m=+145.672100509 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.072465 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:57.072374721 +0000 UTC m=+145.672473171 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.092562 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.092627 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.092671 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.092671 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.093049 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.093094 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.093172 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.093252 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124420 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124496 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124520 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124540 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227364 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227375 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227393 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227406 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330317 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330371 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330409 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330425 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.432931 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.433030 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.433068 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.433098 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.433118 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536363 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536438 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536456 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536481 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536499 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639543 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639567 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639598 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639621 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742107 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742173 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742217 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742235 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.844932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.844983 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.844997 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.845018 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.845031 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948429 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948467 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948490 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948499 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051206 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051250 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051279 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051292 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153679 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153725 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153766 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153778 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256360 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256394 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256401 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256416 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256424 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359175 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359294 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359321 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359338 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462031 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462081 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462093 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462123 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564147 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564208 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564237 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666840 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666890 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666901 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666919 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666931 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769812 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769877 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769902 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769925 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769940 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872547 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872586 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872597 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872613 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872625 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974820 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974878 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974898 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974924 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974943 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.077998 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.078074 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.078092 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.078118 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.078137 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.092559 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.092634 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.092668 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.092557 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:27 crc kubenswrapper[4983]: E0316 00:08:27.092730 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:27 crc kubenswrapper[4983]: E0316 00:08:27.092888 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:27 crc kubenswrapper[4983]: E0316 00:08:27.093150 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:27 crc kubenswrapper[4983]: E0316 00:08:27.093235 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181672 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181782 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181811 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181828 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284301 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284310 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284329 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284340 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386489 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386523 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386537 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489734 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489791 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489800 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489821 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489831 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591792 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591825 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591839 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591848 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694158 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694205 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694221 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694238 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694250 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796872 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796920 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796928 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796943 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796953 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900390 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900457 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900474 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900500 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900518 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003448 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003524 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003544 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003572 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003592 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106610 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106650 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106688 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106712 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209694 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209793 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209811 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209834 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209855 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312381 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312450 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312469 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312488 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312500 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415368 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415421 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415437 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415461 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415478 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519196 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519279 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519301 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519327 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519347 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.621906 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.621978 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.622008 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.622041 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.622061 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724855 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724918 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724935 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724962 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724981 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828088 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828143 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828215 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930819 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930868 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930884 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930909 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930926 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034667 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034738 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034803 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034834 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034857 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.092110 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:29 crc kubenswrapper[4983]: E0316 00:08:29.092422 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.092193 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:29 crc kubenswrapper[4983]: E0316 00:08:29.092654 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.092225 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.092136 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:29 crc kubenswrapper[4983]: E0316 00:08:29.092901 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:29 crc kubenswrapper[4983]: E0316 00:08:29.093081 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137516 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137528 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137545 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137559 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
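Taken together with the earlier bursts at 00:08:25.092 and 00:08:27.092, this burst at 00:08:29.092 shows the pod workers retrying sandbox creation for the same four pods on a roughly two-second cadence. A sketch that measures that cadence from the klog timestamps (kubelet.log is again an assumed local copy; entries that this excerpt splits across lines will simply be skipped):

    import re
    from datetime import datetime

    # Match the util.go:30 sandbox message plus its HH:MM:SS.micros stamp
    # from the klog header (which carries no year).
    PAT = re.compile(
        r'I\d{4} (\d{2}:\d{2}:\d{2}\.\d{6}) \d+ util\.go:30\] '
        r'"No sandbox for pod can be found\. Need to start a new one" pod="([^"]+)"'
    )

    last = {}
    with open("kubelet.log") as f:
        for line in f:
            for stamp, pod in PAT.findall(line):
                t = datetime.strptime(stamp, "%H:%M:%S.%f")
                if pod in last:
                    delta = (t - last[pod]).total_seconds()
                    print(f"{pod}: retried after {delta:.1f}s")
                last[pod] = t
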
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240078 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240138 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240154 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240178 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240195 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342546 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342611 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342631 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342655 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342675 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446050 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446104 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446158 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548522 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548588 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548605 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548631 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548727 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.650952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.650993 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.651005 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.651020 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.651030 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753574 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753642 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753663 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753693 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753715 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856591 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856642 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856659 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856683 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856702 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959688 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959748 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959817 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959849 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959871 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062715 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062812 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062837 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062876 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165612 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165623 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165640 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165653 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269376 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269419 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269427 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269451 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371548 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371587 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371595 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371611 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371620 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.474964 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.475118 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.475149 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.475226 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.475253 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578106 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578183 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578197 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578250 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578268 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.681716 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.682071 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.682198 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.682387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.682553 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.785575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.786036 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.786239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.786993 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.787200 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890715 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890783 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890795 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890812 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890823 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994492 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994545 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994557 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994576 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994591 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.091929 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.091985 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.092050 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:31 crc kubenswrapper[4983]: E0316 00:08:31.092112 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.092132 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:31 crc kubenswrapper[4983]: E0316 00:08:31.092253 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:31 crc kubenswrapper[4983]: E0316 00:08:31.092393 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:31 crc kubenswrapper[4983]: E0316 00:08:31.092482 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097601 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097705 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097749 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097910 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097928 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.201889 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.202261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.202463 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.202666 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.202938 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.306903 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.307224 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.307409 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.307552 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.307678 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410020 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410090 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410113 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410144 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410166 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513341 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513397 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513414 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513440 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513459 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.616542 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.616967 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.617182 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.617412 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.617606 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720382 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720462 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720485 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720517 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720538 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.823818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.824057 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.824120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.824242 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.824338 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.926645 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.926918 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.926996 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.927075 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.927152 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029271 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029282 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029301 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029323 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.095995 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:32 crc kubenswrapper[4983]: E0316 00:08:32.096595 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.110863 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.111704 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.127577 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: E0316 00:08:32.129803 4983 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.146690 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.166122 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5
a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.179969 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: E0316 00:08:32.186196 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.206075 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.220365 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.232308 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.244287 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.256801 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.274329 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.286714 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.298533 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: 
I0316 00:08:32.312339 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.327980 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.341245 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.355569 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080178 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080226 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080242 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080258 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080268 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.092127 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.092127 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.092156 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.092637 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.092369 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.092162 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.092663 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.093289 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.092692 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096566 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096608 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096624 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096645 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096661 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.107942 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111150 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111184 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111207 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111216 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.121894 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125358 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125451 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125465 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125491 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.139900 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143432 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143489 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143520 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.162780 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.162936 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:35 crc kubenswrapper[4983]: I0316 00:08:35.091790 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:35 crc kubenswrapper[4983]: E0316 00:08:35.092743 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:35 crc kubenswrapper[4983]: I0316 00:08:35.091862 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:35 crc kubenswrapper[4983]: E0316 00:08:35.093047 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:35 crc kubenswrapper[4983]: I0316 00:08:35.091834 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:35 crc kubenswrapper[4983]: E0316 00:08:35.093302 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:35 crc kubenswrapper[4983]: I0316 00:08:35.091936 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:35 crc kubenswrapper[4983]: E0316 00:08:35.093538 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.309794 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.324986 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.341406 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.356315 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.373635 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.396882 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.439524 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.457772 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.477407 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.505490 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f
12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.523137 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.540698 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: 
I0316 00:08:36.558509 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.579876 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.599628 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.622411 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.634105 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.656201 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
3-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.670536 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4983]: I0316 00:08:37.092130 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:37 crc kubenswrapper[4983]: I0316 00:08:37.092248 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:37 crc kubenswrapper[4983]: I0316 00:08:37.092135 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.092333 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.092434 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:37 crc kubenswrapper[4983]: I0316 00:08:37.092562 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.092643 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.092848 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.187960 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:08:39 crc kubenswrapper[4983]: I0316 00:08:39.091639 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:39 crc kubenswrapper[4983]: E0316 00:08:39.091868 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:39 crc kubenswrapper[4983]: I0316 00:08:39.091647 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:39 crc kubenswrapper[4983]: I0316 00:08:39.091645 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:39 crc kubenswrapper[4983]: E0316 00:08:39.092030 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:39 crc kubenswrapper[4983]: I0316 00:08:39.091664 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:39 crc kubenswrapper[4983]: E0316 00:08:39.092183 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:39 crc kubenswrapper[4983]: E0316 00:08:39.092286 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.106806 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.754025 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/0.log" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.754412 4983 generic.go:334] "Generic (PLEG): container finished" podID="f81ec143-6c51-4f96-ae71-a4759bac7c70" containerID="05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7" exitCode=1 Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.754485 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerDied","Data":"05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7"} Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.755369 4983 scope.go:117] "RemoveContainer" containerID="05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.770114 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.786189 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.806085 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.825301 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.837619 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.848846 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.861431 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.874479 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.891183 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.907613 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.927435 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.942260 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.958211 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f
12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.970267 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.984878 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: 
I0316 00:08:40.999999 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.011510 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.025205 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.035455 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.091951 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.091951 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.091964 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.092020 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:41 crc kubenswrapper[4983]: E0316 00:08:41.092111 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:41 crc kubenswrapper[4983]: E0316 00:08:41.092169 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:41 crc kubenswrapper[4983]: E0316 00:08:41.092216 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:41 crc kubenswrapper[4983]: E0316 00:08:41.092260 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.760174 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/0.log" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.760253 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9"} Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.777660 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.791633 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.803986 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 
00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.823935 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.840215 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.855384 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.868466 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.881249 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.895279 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.906855 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.919042 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.932375 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.941885 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.950855 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.962497 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.975982 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.988960 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.009142 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.027802 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.104674 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.118035 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: 
I0316 00:08:42.133491 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.149566 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 
2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.162308 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.175927 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: E0316 00:08:42.188470 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.189279 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.206530 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.217926 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.230686 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.244192 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.261108 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.272958 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.286797 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.298049 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.308316 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.320666 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.340742 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.361448 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.092587 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.092682 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.092706 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.092815 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.093001 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.093173 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
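All of the failures recorded above reduce to two conditions that the entries themselves state: every status patch is rejected because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired at 2025-08-24T17:21:41Z (while the node clock reads 2026-03-16), and the node goes NotReady because ovnkube-controller is in CrashLoopBackOff, so no CNI configuration is ever written to /etc/kubernetes/cni/net.d/. A minimal Go sketch of how one might confirm both conditions from the node follows; this is a hypothetical diagnostic helper, not kubelet or OVN-Kubernetes code, and the directory path and port are taken directly from the log lines above.

package main

import (
	"crypto/tls"
	"fmt"
	"path/filepath"
	"time"
)

func main() {
	// Condition 1: the kubelet reports NetworkPluginNotReady because this
	// directory holds no CNI config while ovnkube-controller crash-loops.
	// (Simplified glob: "*.conf*" covers .conf and .conflist; the kubelet
	// also accepts .json files.)
	confs, err := filepath.Glob("/etc/kubernetes/cni/net.d/*.conf*")
	if err != nil || len(confs) == 0 {
		fmt.Println("no CNI configuration files found (matches the NetworkReady=false entries)")
	} else {
		fmt.Println("CNI configs present:", confs)
	}

	// Condition 2: every "Failed to update status for pod" entry ends in the
	// same x509 error. Fetch the webhook's serving certificate without
	// verifying it, then compare its NotAfter against the wall clock.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("cannot reach webhook listener:", err)
		return
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		fmt.Println("no peer certificate presented")
		return
	}
	cert := state.PeerCertificates[0]
	now := time.Now()
	fmt.Printf("webhook cert NotAfter=%s now=%s expired=%v\n",
		cert.NotAfter.Format(time.RFC3339), now.Format(time.RFC3339), now.After(cert.NotAfter))
}

Run on this node, such a check would be expected to print the missing-config message and expired=true with a NotAfter matching the 2025-08-24T17:21:41Z quoted in every webhook failure above. The log continues: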
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.092612 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.093391 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403615 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403665 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403692 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403709 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403720 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.417724 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422287 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422321 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422335 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.437373 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442357 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442412 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442429 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442451 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442466 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.456834 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461233 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461291 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461306 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461327 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461344 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.478232 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491534 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491581 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491596 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491618 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491635 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.561907 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.562081 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.092356 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.092367 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:45 crc kubenswrapper[4983]: E0316 00:08:45.092461 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.092521 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.093091 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:45 crc kubenswrapper[4983]: E0316 00:08:45.093262 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:45 crc kubenswrapper[4983]: E0316 00:08:45.093307 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:45 crc kubenswrapper[4983]: E0316 00:08:45.093432 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.093832 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.774785 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/2.log" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.777985 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.778484 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.791153 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.801790 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.812041 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.823895 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.834503 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 
00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.851050 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.863312 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.871524 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.882154 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753f
c478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.892070 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.899699 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.907902 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.918392 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.931203 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.950735 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.961999 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.972824 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.982975 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.999212 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.784837 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.786341 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/2.log" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.791497 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" exitCode=1 Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.791564 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.791621 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.792808 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:08:46 crc kubenswrapper[4983]: E0316 00:08:46.793141 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.814522 4983 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.832868 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.851000 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.870517 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.888814 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 
00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.914045 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.928048 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.938547 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.954550 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753f
c478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.981479 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.993230 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.005025 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.018454 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.034362 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.064900 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.081068 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.091861 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.091915 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.091951 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.091936 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.092039 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.092165 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.092272 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.092322 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.100705 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.113208 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.129563 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z"
Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.190173 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.798572 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.805027 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"
Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.805837 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.821858 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.839111 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.858815 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.877612 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.893497 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.912221 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z"
Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.939263 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.953539 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.969212 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.982006 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.994501 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.004746 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.026850 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.039997 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.052571 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: 
I0316 00:08:48.064899 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.076254 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.087486 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 
00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.103835 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4983]: I0316 00:08:49.092563 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:49 crc kubenswrapper[4983]: I0316 00:08:49.092637 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:49 crc kubenswrapper[4983]: I0316 00:08:49.092688 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:49 crc kubenswrapper[4983]: I0316 00:08:49.092588 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:49 crc kubenswrapper[4983]: E0316 00:08:49.092815 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:49 crc kubenswrapper[4983]: E0316 00:08:49.092933 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:49 crc kubenswrapper[4983]: E0316 00:08:49.093178 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:49 crc kubenswrapper[4983]: E0316 00:08:49.093276 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:51 crc kubenswrapper[4983]: I0316 00:08:51.092104 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:51 crc kubenswrapper[4983]: I0316 00:08:51.092164 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:51 crc kubenswrapper[4983]: I0316 00:08:51.092212 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:51 crc kubenswrapper[4983]: I0316 00:08:51.092266 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:51 crc kubenswrapper[4983]: E0316 00:08:51.092402 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:51 crc kubenswrapper[4983]: E0316 00:08:51.092516 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:51 crc kubenswrapper[4983]: E0316 00:08:51.092626 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:51 crc kubenswrapper[4983]: E0316 00:08:51.092819 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.111951 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.134209 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.151243 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.168474 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.185970 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: E0316 00:08:52.190943 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.210923 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.233225 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.269869 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.302192 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.321554 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.342008 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: 
I0316 00:08:52.358465 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.383689 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 
2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.404348 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.433069 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.448312 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.461541 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.473465 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.483798 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.092415 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.092438 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.092547 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.092636 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.093133 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.093288 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.093405 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.093487 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747076 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747637 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747710 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747796 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747865 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.763589 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767944 4983 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767953 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767972 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767982 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.786520 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790714 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790776 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
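Root cause of the burst of patch failures above: the serving certificate of the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-03-16. Go's TLS handshake rejects any peer certificate whose validity window does not contain the current time. A minimal standalone sketch of that check (not kubelet or webhook code; the PEM path is a placeholder):

```go
// Sketch only: reproduce the validity check behind
// "x509: certificate has expired or is not yet valid".
// The certificate path is a placeholder, not taken from the log.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("serving-cert.pem") // placeholder path
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil || block.Type != "CERTIFICATE" {
		log.Fatal("no CERTIFICATE block in input")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n",
		cert.NotBefore.Format(time.RFC3339),
		cert.NotAfter.Format(time.RFC3339),
		now.Format(time.RFC3339))
	// Same comparison the TLS stack makes during the handshake:
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		fmt.Println("certificate has expired or is not yet valid")
	}
}
```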
event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790791 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790807 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790820 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.805578 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809673 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809703 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
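Each NotReady condition above carries the same message: no CNI configuration file in /etc/kubernetes/cni/net.d/. The runtime keeps reporting NetworkReady=false until a network config appears in that directory. A rough sketch of such a probe (a simplification under that assumption, not the actual libcni/CRI-O implementation):

```go
// Sketch: report whether any CNI network config exists, roughly what
// stands behind "NetworkReady=false ... no CNI configuration file in
// /etc/kubernetes/cni/net.d/". Real runtimes delegate this to libcni.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cniConfigPresent(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni scans for
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	if !ok {
		fmt.Println("NetworkReady=false reason:NetworkPluginNotReady: no CNI configuration file")
		return
	}
	fmt.Println("NetworkReady=true")
}
```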
event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809711 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809724 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809734 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.827382 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.830980 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.831077 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
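The kubelet does not give up after one failed PATCH: kubelet_node_status.go retries the status update a fixed number of times (nodeStatusUpdateRetry, 5 upstream) before logging "Unable to update node status", which is exactly how this burst ends below. A condensed sketch of that loop, paraphrased from memory rather than quoted from kubelet source:

```go
// Condensed sketch of the kubelet's node-status retry loop
// (paraphrased; see pkg/kubelet/kubelet_node_status.go upstream).
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // upstream default

// tryPatchNodeStatus stands in for the real PATCH against the API server;
// here it always fails the way the webhook calls above do.
func tryPatchNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryPatchNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err) // matches the E-line below
	}
}
```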
event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.831143 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.831209 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.831279 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.847818 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.847971 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:55 crc kubenswrapper[4983]: I0316 00:08:55.092382 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:55 crc kubenswrapper[4983]: E0316 00:08:55.092493 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:55 crc kubenswrapper[4983]: I0316 00:08:55.092536 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:55 crc kubenswrapper[4983]: I0316 00:08:55.092571 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:55 crc kubenswrapper[4983]: E0316 00:08:55.092808 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:55 crc kubenswrapper[4983]: I0316 00:08:55.093180 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:55 crc kubenswrapper[4983]: E0316 00:08:55.093276 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:55 crc kubenswrapper[4983]: E0316 00:08:55.093403 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.006745 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.006943 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.00691601 +0000 UTC m=+209.607014480 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.007129 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.007205 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.007253 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007340 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007383 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007408 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.007391823 +0000 UTC m=+209.607490293 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007451 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.007433344 +0000 UTC m=+209.607531784 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007549 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007567 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007581 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007620 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.007608199 +0000 UTC m=+209.607706729 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.092290 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.092364 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.092403 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.092361 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.092531 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.092645 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.092840 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.093012 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.108558 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108689 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108703 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108713 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108750 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.108737857 +0000 UTC m=+209.708836287 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.108924 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108930 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.109022 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.108997524 +0000 UTC m=+209.709096054 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.192896 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:08:59 crc kubenswrapper[4983]: I0316 00:08:59.092158 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:59 crc kubenswrapper[4983]: I0316 00:08:59.092232 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:59 crc kubenswrapper[4983]: I0316 00:08:59.092195 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:59 crc kubenswrapper[4983]: I0316 00:08:59.092185 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:59 crc kubenswrapper[4983]: E0316 00:08:59.092397 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:59 crc kubenswrapper[4983]: E0316 00:08:59.092538 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:59 crc kubenswrapper[4983]: E0316 00:08:59.092742 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:59 crc kubenswrapper[4983]: E0316 00:08:59.092870 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:01 crc kubenswrapper[4983]: I0316 00:09:01.091929 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:01 crc kubenswrapper[4983]: I0316 00:09:01.091982 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:01 crc kubenswrapper[4983]: I0316 00:09:01.091969 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:01 crc kubenswrapper[4983]: I0316 00:09:01.091945 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:01 crc kubenswrapper[4983]: E0316 00:09:01.092124 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:01 crc kubenswrapper[4983]: E0316 00:09:01.092322 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:01 crc kubenswrapper[4983]: E0316 00:09:01.092450 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:01 crc kubenswrapper[4983]: E0316 00:09:01.092658 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.093288 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:09:02 crc kubenswrapper[4983]: E0316 00:09:02.093650 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.112712 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.128389 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.145604 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.161040 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.174192 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.188729 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: E0316 00:09:02.193405 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.224428 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.239085 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.257726 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3
d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.269362 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.281586 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: 
I0316 00:09:02.295600 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.307861 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.319114 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.330573 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 
00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.344464 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.359551 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.368911 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.381037 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:03 crc kubenswrapper[4983]: I0316 00:09:03.091934 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:03 crc kubenswrapper[4983]: I0316 00:09:03.091948 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:03 crc kubenswrapper[4983]: I0316 00:09:03.092274 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:03 crc kubenswrapper[4983]: E0316 00:09:03.092251 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:03 crc kubenswrapper[4983]: E0316 00:09:03.092424 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:03 crc kubenswrapper[4983]: I0316 00:09:03.092405 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:03 crc kubenswrapper[4983]: E0316 00:09:03.092448 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:03 crc kubenswrapper[4983]: E0316 00:09:03.092669 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043459 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043475 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043499 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043515 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.058898 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063050 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063097 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063108 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063125 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063137 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.081652 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085153 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085223 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085238 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085264 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085299 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.100627 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104681 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104705 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104713 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104726 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104737 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.122513 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126264 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126317 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126340 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126367 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126385 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.142865 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.143001 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:05 crc kubenswrapper[4983]: I0316 00:09:05.092130 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:05 crc kubenswrapper[4983]: I0316 00:09:05.092184 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:05 crc kubenswrapper[4983]: I0316 00:09:05.092229 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:05 crc kubenswrapper[4983]: E0316 00:09:05.092258 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:05 crc kubenswrapper[4983]: I0316 00:09:05.092155 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:05 crc kubenswrapper[4983]: E0316 00:09:05.092328 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:05 crc kubenswrapper[4983]: E0316 00:09:05.092440 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:05 crc kubenswrapper[4983]: E0316 00:09:05.092478 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:07 crc kubenswrapper[4983]: I0316 00:09:07.092290 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:07 crc kubenswrapper[4983]: I0316 00:09:07.092385 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.092516 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:07 crc kubenswrapper[4983]: I0316 00:09:07.092317 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.092666 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:07 crc kubenswrapper[4983]: I0316 00:09:07.092747 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.092874 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.093111 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.194696 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:09 crc kubenswrapper[4983]: I0316 00:09:09.092409 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:09 crc kubenswrapper[4983]: I0316 00:09:09.092458 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:09 crc kubenswrapper[4983]: E0316 00:09:09.093227 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:09 crc kubenswrapper[4983]: I0316 00:09:09.092560 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:09 crc kubenswrapper[4983]: I0316 00:09:09.092500 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:09 crc kubenswrapper[4983]: E0316 00:09:09.093398 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:09 crc kubenswrapper[4983]: E0316 00:09:09.093576 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:09 crc kubenswrapper[4983]: E0316 00:09:09.093686 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:11 crc kubenswrapper[4983]: I0316 00:09:11.092406 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:11 crc kubenswrapper[4983]: I0316 00:09:11.092457 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:11 crc kubenswrapper[4983]: I0316 00:09:11.092478 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:11 crc kubenswrapper[4983]: E0316 00:09:11.092554 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:11 crc kubenswrapper[4983]: I0316 00:09:11.092575 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:11 crc kubenswrapper[4983]: E0316 00:09:11.092797 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:11 crc kubenswrapper[4983]: E0316 00:09:11.092924 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:11 crc kubenswrapper[4983]: E0316 00:09:11.093080 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.115658 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.140119 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.159411 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 
00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.181177 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: E0316 00:09:12.195851 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.204497 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.217471 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.235841 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.252268 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.267809 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.279575 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.308987 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.319549 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.330802 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.340431 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.352394 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.362742 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.378924 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.390924 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.405262 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:13 crc kubenswrapper[4983]: 
I0316 00:09:13.092443 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:13 crc kubenswrapper[4983]: I0316 00:09:13.092482 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:13 crc kubenswrapper[4983]: I0316 00:09:13.092482 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:13 crc kubenswrapper[4983]: I0316 00:09:13.092540 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:13 crc kubenswrapper[4983]: E0316 00:09:13.093099 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:13 crc kubenswrapper[4983]: E0316 00:09:13.093242 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:13 crc kubenswrapper[4983]: E0316 00:09:13.093426 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:13 crc kubenswrapper[4983]: E0316 00:09:13.093551 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464295 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464341 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464352 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464392 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464411 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.479587 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484180 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484254 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
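
[Annotation] The node-status patch above is rejected because the webhook's serving certificate (NotAfter 2025-08-24T17:21:41Z) predates the node's clock (2026-03-16). A minimal Go probe of the endpoint named in the error, assuming 127.0.0.1:9743 is reachable from the host; InsecureSkipVerify is set only so the handshake completes and the expired leaf can be inspected, since verification is exactly the step that fails.

// webhook_cert_probe.go — reproduce the validity-window check that fails above.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspection only; do not verify here
	})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	switch {
	case now.Before(leaf.NotBefore):
		fmt.Printf("certificate is not yet valid: %s is before %s\n",
			now.UTC().Format(time.RFC3339), leaf.NotBefore.UTC().Format(time.RFC3339))
	case now.After(leaf.NotAfter):
		// Matches the log: "current time ... is after 2025-08-24T17:21:41Z".
		fmt.Printf("certificate has expired: %s is after %s\n",
			now.UTC().Format(time.RFC3339), leaf.NotAfter.UTC().Format(time.RFC3339))
	default:
		fmt.Printf("certificate valid until %s\n", leaf.NotAfter.UTC().Format(time.RFC3339))
	}
}
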
event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484276 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484306 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484327 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.504365 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509353 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509427 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509482 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509505 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.529309 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534007 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534088 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534111 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534128 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.551020 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555816 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555856 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555864 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555878 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555887 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.573000 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.573103 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:15 crc kubenswrapper[4983]: I0316 00:09:15.092224 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:15 crc kubenswrapper[4983]: I0316 00:09:15.092311 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:15 crc kubenswrapper[4983]: E0316 00:09:15.092384 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:15 crc kubenswrapper[4983]: I0316 00:09:15.092407 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:15 crc kubenswrapper[4983]: I0316 00:09:15.092309 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:15 crc kubenswrapper[4983]: E0316 00:09:15.092508 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:15 crc kubenswrapper[4983]: E0316 00:09:15.092613 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:15 crc kubenswrapper[4983]: E0316 00:09:15.092829 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092009 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092201 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.092236 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092200 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092208 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.092489 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092790 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.092935 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.093011 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.093095 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.196696 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:19 crc kubenswrapper[4983]: I0316 00:09:19.092313 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:19 crc kubenswrapper[4983]: E0316 00:09:19.092463 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:19 crc kubenswrapper[4983]: I0316 00:09:19.092486 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:19 crc kubenswrapper[4983]: I0316 00:09:19.092565 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:19 crc kubenswrapper[4983]: I0316 00:09:19.092684 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:19 crc kubenswrapper[4983]: E0316 00:09:19.092796 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:19 crc kubenswrapper[4983]: E0316 00:09:19.092675 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:19 crc kubenswrapper[4983]: E0316 00:09:19.093244 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:21 crc kubenswrapper[4983]: I0316 00:09:21.092628 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:21 crc kubenswrapper[4983]: I0316 00:09:21.092656 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:21 crc kubenswrapper[4983]: I0316 00:09:21.092727 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:21 crc kubenswrapper[4983]: I0316 00:09:21.092735 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:21 crc kubenswrapper[4983]: E0316 00:09:21.092878 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:21 crc kubenswrapper[4983]: E0316 00:09:21.093008 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:21 crc kubenswrapper[4983]: E0316 00:09:21.093166 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:21 crc kubenswrapper[4983]: E0316 00:09:21.093311 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.113248 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=80.113233765 podStartE2EDuration="1m20.113233765s" podCreationTimestamp="2026-03-16 00:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.112956058 +0000 UTC m=+170.713054488" watchObservedRunningTime="2026-03-16 00:09:22.113233765 +0000 UTC m=+170.713332195" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.129237 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.129222445 podStartE2EDuration="1m17.129222445s" podCreationTimestamp="2026-03-16 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.128726641 +0000 UTC m=+170.728825111" watchObservedRunningTime="2026-03-16 00:09:22.129222445 +0000 UTC m=+170.729320875" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.161250 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.161228604 podStartE2EDuration="42.161228604s" podCreationTimestamp="2026-03-16 00:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.160888814 +0000 UTC m=+170.760987244" watchObservedRunningTime="2026-03-16 00:09:22.161228604 +0000 UTC m=+170.761327044" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.187068 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=64.187049453 podStartE2EDuration="1m4.187049453s" 
podCreationTimestamp="2026-03-16 00:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.184129613 +0000 UTC m=+170.784228053" watchObservedRunningTime="2026-03-16 00:09:22.187049453 +0000 UTC m=+170.787147903" Mar 16 00:09:22 crc kubenswrapper[4983]: E0316 00:09:22.197184 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.238189 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v748m" podStartSLOduration=124.238166897 podStartE2EDuration="2m4.238166897s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.227539435 +0000 UTC m=+170.827637865" watchObservedRunningTime="2026-03-16 00:09:22.238166897 +0000 UTC m=+170.838265327" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.238626 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d2h5k" podStartSLOduration=124.23861867 podStartE2EDuration="2m4.23861867s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.238563588 +0000 UTC m=+170.838662018" watchObservedRunningTime="2026-03-16 00:09:22.23861867 +0000 UTC m=+170.838717100" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.257400 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podStartSLOduration=123.257385805 podStartE2EDuration="2m3.257385805s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.256934683 +0000 UTC m=+170.857033113" watchObservedRunningTime="2026-03-16 00:09:22.257385805 +0000 UTC m=+170.857484235" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.288280 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=50.288263853 podStartE2EDuration="50.288263853s" podCreationTimestamp="2026-03-16 00:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.268731897 +0000 UTC m=+170.868830327" watchObservedRunningTime="2026-03-16 00:09:22.288263853 +0000 UTC m=+170.888362283" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.324290 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tqncp" podStartSLOduration=123.324272032 podStartE2EDuration="2m3.324272032s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.3241911 +0000 UTC m=+170.924289530" watchObservedRunningTime="2026-03-16 00:09:22.324272032 +0000 UTC m=+170.924370462" Mar 16 00:09:22 crc 
kubenswrapper[4983]: I0316 00:09:22.366420 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" podStartSLOduration=123.36639407 podStartE2EDuration="2m3.36639407s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.345795504 +0000 UTC m=+170.945893934" watchObservedRunningTime="2026-03-16 00:09:22.36639407 +0000 UTC m=+170.966492510" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.367041 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" podStartSLOduration=123.367034707 podStartE2EDuration="2m3.367034707s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.365936867 +0000 UTC m=+170.966035297" watchObservedRunningTime="2026-03-16 00:09:22.367034707 +0000 UTC m=+170.967133157" Mar 16 00:09:23 crc kubenswrapper[4983]: I0316 00:09:23.092614 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:23 crc kubenswrapper[4983]: I0316 00:09:23.092649 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:23 crc kubenswrapper[4983]: E0316 00:09:23.092896 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:23 crc kubenswrapper[4983]: I0316 00:09:23.092939 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:23 crc kubenswrapper[4983]: I0316 00:09:23.092957 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:23 crc kubenswrapper[4983]: E0316 00:09:23.093095 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:23 crc kubenswrapper[4983]: E0316 00:09:23.093331 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:23 crc kubenswrapper[4983]: E0316 00:09:23.093523 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702320 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702363 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702397 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:24Z","lastTransitionTime":"2026-03-16T00:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.760638 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m"] Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.760994 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.763561 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.766414 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.766950 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.767167 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896229 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb13872-d3dc-4349-b763-f46e4cc112d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896283 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896319 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2fb13872-d3dc-4349-b763-f46e4cc112d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896356 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fb13872-d3dc-4349-b763-f46e4cc112d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896395 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997847 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb13872-d3dc-4349-b763-f46e4cc112d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc 
kubenswrapper[4983]: I0316 00:09:24.997887 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997925 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2fb13872-d3dc-4349-b763-f46e4cc112d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997951 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fb13872-d3dc-4349-b763-f46e4cc112d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997970 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.998037 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.998157 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.998869 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2fb13872-d3dc-4349-b763-f46e4cc112d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.003357 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb13872-d3dc-4349-b763-f46e4cc112d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.016503 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fb13872-d3dc-4349-b763-f46e4cc112d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.076220 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.091514 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:25 crc kubenswrapper[4983]: E0316 00:09:25.091682 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.091983 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.092029 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:25 crc kubenswrapper[4983]: E0316 00:09:25.092259 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:25 crc kubenswrapper[4983]: E0316 00:09:25.092449 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.092710 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:25 crc kubenswrapper[4983]: E0316 00:09:25.092902 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.140593 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.148503 4983 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.938065 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" event={"ID":"2fb13872-d3dc-4349-b763-f46e4cc112d5","Type":"ContainerStarted","Data":"fd381275245e78bc69f2ac7cb422a2e9888d68627fad78d28e5f114a9a1b7eb0"} Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.938202 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" event={"ID":"2fb13872-d3dc-4349-b763-f46e4cc112d5","Type":"ContainerStarted","Data":"fdc0ccc239745118906714b20613e4415d7d481bf0b908fa5ae77271fe1d1f8c"} Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.955183 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" podStartSLOduration=126.955158178 podStartE2EDuration="2m6.955158178s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:25.954913012 +0000 UTC m=+174.555011442" watchObservedRunningTime="2026-03-16 00:09:25.955158178 +0000 UTC m=+174.555256648" Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.942610 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943329 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/0.log" Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943355 4983 generic.go:334] "Generic (PLEG): container finished" podID="f81ec143-6c51-4f96-ae71-a4759bac7c70" containerID="dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9" exitCode=1 Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943379 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerDied","Data":"dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9"} Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943414 4983 scope.go:117] "RemoveContainer" containerID="05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7" Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943746 4983 scope.go:117] "RemoveContainer" containerID="dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9" Mar 16 00:09:26 crc kubenswrapper[4983]: E0316 00:09:26.943954 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tqncp_openshift-multus(f81ec143-6c51-4f96-ae71-a4759bac7c70)\"" pod="openshift-multus/multus-tqncp" podUID="f81ec143-6c51-4f96-ae71-a4759bac7c70" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.091622 4983 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.091697 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.091634 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.091778 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.091776 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.091889 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.092008 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.092064 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.198681 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.948877 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:09:29 crc kubenswrapper[4983]: I0316 00:09:29.092230 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:29 crc kubenswrapper[4983]: I0316 00:09:29.092273 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:29 crc kubenswrapper[4983]: E0316 00:09:29.092385 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:29 crc kubenswrapper[4983]: I0316 00:09:29.092236 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:29 crc kubenswrapper[4983]: I0316 00:09:29.092258 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:29 crc kubenswrapper[4983]: E0316 00:09:29.092569 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:29 crc kubenswrapper[4983]: E0316 00:09:29.092677 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:29 crc kubenswrapper[4983]: E0316 00:09:29.092835 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.093653 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.958185 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qvtjp"] Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.958306 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:30 crc kubenswrapper[4983]: E0316 00:09:30.958402 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.962316 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.964261 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.965210 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.995173 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podStartSLOduration=131.995155931 podStartE2EDuration="2m11.995155931s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:30.992950341 +0000 UTC m=+179.593048771" watchObservedRunningTime="2026-03-16 00:09:30.995155931 +0000 UTC m=+179.595254361" Mar 16 00:09:31 crc kubenswrapper[4983]: I0316 00:09:31.091566 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:31 crc kubenswrapper[4983]: I0316 00:09:31.091566 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:31 crc kubenswrapper[4983]: E0316 00:09:31.091691 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:31 crc kubenswrapper[4983]: I0316 00:09:31.091574 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:31 crc kubenswrapper[4983]: E0316 00:09:31.091803 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:31 crc kubenswrapper[4983]: E0316 00:09:31.091872 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:32 crc kubenswrapper[4983]: I0316 00:09:32.092231 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:32 crc kubenswrapper[4983]: E0316 00:09:32.093217 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:32 crc kubenswrapper[4983]: E0316 00:09:32.199334 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:33 crc kubenswrapper[4983]: I0316 00:09:33.091822 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:33 crc kubenswrapper[4983]: I0316 00:09:33.091848 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:33 crc kubenswrapper[4983]: I0316 00:09:33.091932 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:33 crc kubenswrapper[4983]: E0316 00:09:33.091993 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:33 crc kubenswrapper[4983]: E0316 00:09:33.092023 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:33 crc kubenswrapper[4983]: E0316 00:09:33.092189 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:34 crc kubenswrapper[4983]: I0316 00:09:34.092108 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:34 crc kubenswrapper[4983]: E0316 00:09:34.093290 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:35 crc kubenswrapper[4983]: I0316 00:09:35.092044 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:35 crc kubenswrapper[4983]: I0316 00:09:35.092124 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:35 crc kubenswrapper[4983]: I0316 00:09:35.092128 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:35 crc kubenswrapper[4983]: E0316 00:09:35.092245 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:35 crc kubenswrapper[4983]: E0316 00:09:35.092391 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:35 crc kubenswrapper[4983]: E0316 00:09:35.092499 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:36 crc kubenswrapper[4983]: I0316 00:09:36.092610 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:36 crc kubenswrapper[4983]: E0316 00:09:36.092939 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:37 crc kubenswrapper[4983]: I0316 00:09:37.092494 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:37 crc kubenswrapper[4983]: I0316 00:09:37.092573 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:37 crc kubenswrapper[4983]: E0316 00:09:37.092686 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:37 crc kubenswrapper[4983]: I0316 00:09:37.092525 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:37 crc kubenswrapper[4983]: E0316 00:09:37.092962 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:37 crc kubenswrapper[4983]: E0316 00:09:37.093173 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:37 crc kubenswrapper[4983]: E0316 00:09:37.201358 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:38 crc kubenswrapper[4983]: I0316 00:09:38.092878 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:38 crc kubenswrapper[4983]: E0316 00:09:38.093076 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:39 crc kubenswrapper[4983]: I0316 00:09:39.092088 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:39 crc kubenswrapper[4983]: E0316 00:09:39.092486 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:39 crc kubenswrapper[4983]: I0316 00:09:39.092142 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:39 crc kubenswrapper[4983]: E0316 00:09:39.092568 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:39 crc kubenswrapper[4983]: I0316 00:09:39.092124 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:39 crc kubenswrapper[4983]: E0316 00:09:39.092642 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:40 crc kubenswrapper[4983]: I0316 00:09:40.092415 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:40 crc kubenswrapper[4983]: E0316 00:09:40.092571 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:41 crc kubenswrapper[4983]: I0316 00:09:41.091554 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:41 crc kubenswrapper[4983]: I0316 00:09:41.091645 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:41 crc kubenswrapper[4983]: I0316 00:09:41.091707 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:41 crc kubenswrapper[4983]: E0316 00:09:41.091701 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:41 crc kubenswrapper[4983]: E0316 00:09:41.091860 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:41 crc kubenswrapper[4983]: E0316 00:09:41.091919 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:41 crc kubenswrapper[4983]: I0316 00:09:41.092298 4983 scope.go:117] "RemoveContainer" containerID="dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9" Mar 16 00:09:42 crc kubenswrapper[4983]: I0316 00:09:42.005465 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:09:42 crc kubenswrapper[4983]: I0316 00:09:42.005560 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da"} Mar 16 00:09:42 crc kubenswrapper[4983]: I0316 00:09:42.092569 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:42 crc kubenswrapper[4983]: E0316 00:09:42.094585 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:42 crc kubenswrapper[4983]: E0316 00:09:42.202236 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:43 crc kubenswrapper[4983]: I0316 00:09:43.092121 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:43 crc kubenswrapper[4983]: I0316 00:09:43.092176 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:43 crc kubenswrapper[4983]: E0316 00:09:43.092248 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:43 crc kubenswrapper[4983]: I0316 00:09:43.092137 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:43 crc kubenswrapper[4983]: E0316 00:09:43.092370 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:43 crc kubenswrapper[4983]: E0316 00:09:43.092606 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:44 crc kubenswrapper[4983]: I0316 00:09:44.092222 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:44 crc kubenswrapper[4983]: E0316 00:09:44.092388 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:45 crc kubenswrapper[4983]: I0316 00:09:45.091605 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:45 crc kubenswrapper[4983]: I0316 00:09:45.091683 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:45 crc kubenswrapper[4983]: I0316 00:09:45.091681 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:45 crc kubenswrapper[4983]: E0316 00:09:45.091779 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:45 crc kubenswrapper[4983]: E0316 00:09:45.091981 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:45 crc kubenswrapper[4983]: E0316 00:09:45.092052 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:46 crc kubenswrapper[4983]: I0316 00:09:46.091972 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:46 crc kubenswrapper[4983]: E0316 00:09:46.092221 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:47 crc kubenswrapper[4983]: I0316 00:09:47.092535 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:47 crc kubenswrapper[4983]: I0316 00:09:47.092615 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:47 crc kubenswrapper[4983]: E0316 00:09:47.092694 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:47 crc kubenswrapper[4983]: I0316 00:09:47.092818 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:47 crc kubenswrapper[4983]: E0316 00:09:47.092900 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:47 crc kubenswrapper[4983]: E0316 00:09:47.093042 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:48 crc kubenswrapper[4983]: I0316 00:09:48.092154 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:48 crc kubenswrapper[4983]: I0316 00:09:48.094673 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:09:48 crc kubenswrapper[4983]: I0316 00:09:48.095575 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.091947 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.092052 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.092064 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.096388 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.096888 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.097332 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.103813 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 16 00:09:53 crc kubenswrapper[4983]: I0316 00:09:53.448874 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:09:53 crc kubenswrapper[4983]: I0316 00:09:53.449808 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:09:53 crc kubenswrapper[4983]: I0316 00:09:53.496033 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.420480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.471156 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc9bv"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.472438 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.473297 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t4lj8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.474120 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.474911 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.475335 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.475997 4983 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.476080 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.476661 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.477704 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482354 4983 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.482446 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482561 4983 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.482595 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482683 4983 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.482713 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482856 4983 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.482891 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 16 00:09:55 crc 
kubenswrapper[4983]: W0316 00:09:55.482967 4983 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.483001 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.486249 4983 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.486485 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.487110 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.487359 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.488205 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.488874 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.488936 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.489496 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.489708 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.497452 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.498127 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.498660 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.499415 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.499862 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.499993 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.500324 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.500868 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.500974 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.518393 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-fp4l5"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.519629 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.521790 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.522182 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.523175 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.523528 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.530961 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531170 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.530531 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531278 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531549 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531656 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531690 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531812 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531971 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532027 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532113 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532278 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532383 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532492 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.535581 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.536408 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.536853 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lx4mf"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.537094 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.537154 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.537233 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.537274 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538241 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538365 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538411 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538627 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538635 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538843 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538890 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539231 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539251 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539288 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539415 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539525 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" 
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539566 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539631 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.540014 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.540236 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.541231 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.541442 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.541916 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.542090 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.542718 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.543118 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.543340 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.543391 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.543742 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.544384 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.544823 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9k8tn"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.545342 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.548630 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29560320-9tclx"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.550115 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.550261 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.550408 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.550485 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.551425 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.551602 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.552776 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.554003 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.558279 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.558666 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-9tclx"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.562988 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.563200 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.563417 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.564278 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.564384 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.567092 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.567557 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.568004 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.568315 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.568397 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.568697 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.569405 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.570520 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-6j9qt"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.601045 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qzvb8"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.601818 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.604808 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.618916 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-w8qpq"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624476 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624645 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624694 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624920 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624983 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6j9qt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.625206 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.626032 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.626152 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.626436 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.626858 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627167 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-w8qpq"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627374 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627168 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627719 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627890 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627263 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627330 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627523 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627574 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.628362 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.628443 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.628547 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.628881 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.629064 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.629708 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.632312 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.632567 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.632966 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.633088 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.633118 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.634692 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.634956 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.635407 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.635573 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.635733 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.635839 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636019 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636706 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcf712a-d77b-446c-b9e8-7083ff491d3c-serving-cert\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636737 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636780 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-config\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636797 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636815 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636840 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/288bbae2-d98f-4e70-8f83-314c8a7a038b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636856 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636867 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636870 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n25x\" (UniqueName: \"kubernetes.io/projected/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-kube-api-access-6n25x\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637114 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dwvz\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-kube-api-access-9dwvz\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637135 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637152 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637206 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637223 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637287 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637306 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-config\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637346 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637367 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637383 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44hs\" (UniqueName: \"kubernetes.io/projected/ddcf712a-d77b-446c-b9e8-7083ff491d3c-kube-api-access-l44hs\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637400 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637426 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637440 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637457 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637475 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637517 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-trusted-ca\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637568 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/288bbae2-d98f-4e70-8f83-314c8a7a038b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637647 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637829 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-serving-cert\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637975 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.638358 4983 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-ingress-operator"/"trusted-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.638523 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.641510 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.642078 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.642484 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.643104 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.643392 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.644099 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.644964 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.645892 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.646241 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t4lj8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.647273 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.648156 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ml6pw"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.649066 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.649159 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.649804 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.650583 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.651333 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.651737 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njztx"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.654983 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t65x6"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.655926 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.657184 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.657608 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.657801 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.659186 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.660834 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n22z7"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.662165 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.663745 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.669303 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-82r5r"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.669588 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.671704 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.675919 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.676499 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.678251 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.678674 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.679685 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.680299 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.681047 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.682199 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.682717 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.684143 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc9bv"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.685188 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6j9qt"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.686449 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qzvb8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.687574 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.689071 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.690225 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.691624 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.693052 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.694485 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fp4l5"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.694738 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.695746 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.697605 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lx4mf"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.698957 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.700395 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 
00:09:55.701724 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.702982 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.704077 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.705317 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29560320-9tclx"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.706371 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.707346 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.710028 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9k8tn"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.711689 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.713183 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.714480 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.714842 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.715446 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.716885 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t65x6"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.718030 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.719110 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.720207 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.721342 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.722629 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mjkh8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.723444 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.723747 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5zxcb"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.724295 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.724887 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ml6pw"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.725849 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.726841 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.727924 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n22z7"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.729084 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njztx"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.730190 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.731215 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.732243 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mjkh8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.733243 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5zxcb"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.734291 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738551 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-config\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738594 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738630 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738658 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/288bbae2-d98f-4e70-8f83-314c8a7a038b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738683 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n25x\" (UniqueName: \"kubernetes.io/projected/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-kube-api-access-6n25x\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738704 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738725 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dwvz\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-kube-api-access-9dwvz\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738747 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738796 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738820 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738899 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: 
\"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738930 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738953 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-config\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738976 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739003 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l44hs\" (UniqueName: \"kubernetes.io/projected/ddcf712a-d77b-446c-b9e8-7083ff491d3c-kube-api-access-l44hs\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739026 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739047 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739081 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739104 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739125 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739147 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739165 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-trusted-ca\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739181 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739196 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/288bbae2-d98f-4e70-8f83-314c8a7a038b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739219 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-serving-cert\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739236 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcf712a-d77b-446c-b9e8-7083ff491d3c-serving-cert\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739252 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc 
kubenswrapper[4983]: I0316 00:09:55.739414 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-config\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.740528 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.741560 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.741916 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.742110 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.742416 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-config\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.742483 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.743201 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.744337 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.744737 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.744849 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/288bbae2-d98f-4e70-8f83-314c8a7a038b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.744946 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.745864 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.746278 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/288bbae2-d98f-4e70-8f83-314c8a7a038b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.746712 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.747007 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.747064 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-trusted-ca\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.747786 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.748337 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcf712a-d77b-446c-b9e8-7083ff491d3c-serving-cert\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.749088 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.749135 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.750304 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-serving-cert\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.755374 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.775288 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.805745 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.817113 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.838948 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.854286 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc 
kubenswrapper[4983]: I0316 00:09:55.875274 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.894356 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.914559 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.934553 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.955395 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.974982 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.994563 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.015147 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.034489 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.055116 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.074172 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.094499 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.114119 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.134016 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.154582 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.174995 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.194790 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.214688 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.235111 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.255434 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.294793 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.315173 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.335673 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345249 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-config\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345288 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-client\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345314 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-encryption-config\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345339 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2688c073-5209-4258-a681-186370d9abcc-machine-approver-tls\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345358 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-policies\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345382 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd00ffd-95e2-47bf-a6fd-663526b2283d-trusted-ca\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345424 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345453 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345480 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345507 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfpdm\" (UniqueName: \"kubernetes.io/projected/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-kube-api-access-jfpdm\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345563 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-serving-cert\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345606 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29gpp\" (UniqueName: \"kubernetes.io/projected/9c737bbb-9153-4689-bbd7-1925cd53b343-kube-api-access-29gpp\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345653 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345786 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345825 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5kgk\" (UniqueName: \"kubernetes.io/projected/211771ed-66f1-4866-b193-5da61bbd38b4-kube-api-access-l5kgk\") pod \"downloads-7954f5f757-6j9qt\" (UID: \"211771ed-66f1-4866-b193-5da61bbd38b4\") " pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345875 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-oauth-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345909 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldpp\" (UniqueName: \"kubernetes.io/projected/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-kube-api-access-6ldpp\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345960 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.345978 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:56.845964423 +0000 UTC m=+205.446062963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346000 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346059 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-auth-proxy-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346080 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346127 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-dir\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346148 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346190 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346215 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-image-import-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346236 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346276 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346299 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b0e4e23-a158-4597-b005-db088a652ec8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346328 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtq5\" (UniqueName: \"kubernetes.io/projected/d76474c2-7d5c-45a0-8869-d829b0c594d6-kube-api-access-kqtq5\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346375 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzmt\" (UniqueName: \"kubernetes.io/projected/2688c073-5209-4258-a681-186370d9abcc-kube-api-access-bzzmt\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346397 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346439 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346459 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-service-ca\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346510 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346530 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346554 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346603 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346624 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346697 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346723 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs6fr\" (UniqueName: \"kubernetes.io/projected/bcce228b-5abb-4cbb-8f79-57326a3a9665-kube-api-access-fs6fr\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346744 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346803 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346872 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346896 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-oauth-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346918 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346966 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-encryption-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347002 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347057 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit-dir\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347083 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8lpz\" (UniqueName: \"kubernetes.io/projected/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-kube-api-access-g8lpz\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347111 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: 
\"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347138 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcce228b-5abb-4cbb-8f79-57326a3a9665-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347167 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwpkm\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-kube-api-access-gwpkm\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347222 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347249 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347284 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcce228b-5abb-4cbb-8f79-57326a3a9665-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347310 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-images\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347332 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zczdg\" (UniqueName: \"kubernetes.io/projected/6b0e4e23-a158-4597-b005-db088a652ec8-kube-api-access-zczdg\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347357 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-trusted-ca-bundle\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " 
pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347424 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-node-pullsecrets\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347453 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347477 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347499 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347523 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347562 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347597 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-metrics-tls\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347621 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebd00ffd-95e2-47bf-a6fd-663526b2283d-metrics-tls\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.355553 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.376851 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.396101 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.415194 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.435295 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448190 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.448350 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:56.948328028 +0000 UTC m=+205.548426468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448405 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448453 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/de84a408-0f98-48c6-83a5-e6976b576989-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448486 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-certs\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448522 4983 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448560 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e46a85-c462-4ef3-a944-6ed47d2b0598-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448597 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcce228b-5abb-4cbb-8f79-57326a3a9665-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448631 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-images\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448667 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zczdg\" (UniqueName: \"kubernetes.io/projected/6b0e4e23-a158-4597-b005-db088a652ec8-kube-api-access-zczdg\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448696 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-trusted-ca-bundle\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448726 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-node-pullsecrets\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448809 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448847 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448903 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448962 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-socket-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448994 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-config\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449003 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-node-pullsecrets\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449026 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-encryption-config\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449161 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-key\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449236 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2688c073-5209-4258-a681-186370d9abcc-machine-approver-tls\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449288 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-tmpfs\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449345 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b6f53f35-efe7-4b1c-9a25-d82b824c156f-config\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449401 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-default-certificate\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449506 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef72d9a-3e65-495f-8e73-ee539c10a29e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449558 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449631 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpdm\" (UniqueName: \"kubernetes.io/projected/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-kube-api-access-jfpdm\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449698 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-images\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449701 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5kgk\" (UniqueName: \"kubernetes.io/projected/211771ed-66f1-4866-b193-5da61bbd38b4-kube-api-access-l5kgk\") pod \"downloads-7954f5f757-6j9qt\" (UID: \"211771ed-66f1-4866-b193-5da61bbd38b4\") " pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449791 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-config\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449830 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jwxb\" (UniqueName: 
\"kubernetes.io/projected/34398886-1821-47c0-bbff-951177287627-kube-api-access-9jwxb\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449854 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449876 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssz5w\" (UniqueName: \"kubernetes.io/projected/31984625-3905-4d4d-9c52-e7d11c6c15d4-kube-api-access-ssz5w\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449899 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/211c2269-7173-4fcb-9403-be48b10ab364-metrics-tls\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449933 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449972 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450016 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qnw\" (UniqueName: \"kubernetes.io/projected/de84a408-0f98-48c6-83a5-e6976b576989-kube-api-access-q6qnw\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450038 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450087 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450108 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450134 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e46a85-c462-4ef3-a944-6ed47d2b0598-config\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450149 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w94hh\" (UniqueName: \"kubernetes.io/projected/33ff8ce9-2d36-4251-ae9d-802d9965bfde-kube-api-access-w94hh\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450165 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450181 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450200 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450225 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b0e4e23-a158-4597-b005-db088a652ec8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450297 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-cabundle\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450338 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450395 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450429 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450478 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450518 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnmbs\" (UniqueName: \"kubernetes.io/projected/61000119-35ce-40ee-a8c5-5ad9052b539d-kube-api-access-fnmbs\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450552 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450571 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450604 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450672 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450737 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22b9ac88-75ea-4572-bd27-f819caf4d8e2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450878 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-etcd-client\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450956 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs6fr\" (UniqueName: \"kubernetes.io/projected/bcce228b-5abb-4cbb-8f79-57326a3a9665-kube-api-access-fs6fr\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451402 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfea0242-abc1-4912-a193-6c4dc75d9bb5-service-ca-bundle\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451475 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451539 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ff8ce9-2d36-4251-ae9d-802d9965bfde-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451645 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-oauth-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451701 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451483 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451746 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-656xd\" (UniqueName: \"kubernetes.io/projected/9c413c46-e4ff-43f2-b66a-8a62e1f08890-kube-api-access-656xd\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452237 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452278 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnznm\" (UniqueName: \"kubernetes.io/projected/8820c8ae-e5d3-4c91-8724-ec666e783179-kube-api-access-tnznm\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452303 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef72d9a-3e65-495f-8e73-ee539c10a29e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452329 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-encryption-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452353 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452369 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit-dir\") pod 
\"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452415 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8lpz\" (UniqueName: \"kubernetes.io/projected/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-kube-api-access-g8lpz\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452434 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") pod \"auto-csr-approver-29560328-sngnj\" (UID: \"9da42bf3-da76-4db7-9653-f2f08567084f\") " pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452455 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcce228b-5abb-4cbb-8f79-57326a3a9665-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452498 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwpkm\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-kube-api-access-gwpkm\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452515 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-webhook-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452533 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-stats-auth\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452817 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453052 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-encryption-config\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453148 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-config\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453226 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcce228b-5abb-4cbb-8f79-57326a3a9665-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453290 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit-dir\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453387 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453607 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2688c073-5209-4258-a681-186370d9abcc-machine-approver-tls\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453671 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450853 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-trusted-ca-bundle\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.454504 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.454883 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.455476 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.455612 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.456353 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-oauth-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.457056 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ccd96-ced1-466f-8891-72abc221bbac-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.457153 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-mountpoint-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.457231 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcce228b-5abb-4cbb-8f79-57326a3a9665-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.457973 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.458120 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31984625-3905-4d4d-9c52-e7d11c6c15d4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.458174 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-srv-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: 
\"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.458859 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b0e4e23-a158-4597-b005-db088a652ec8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.459210 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-encryption-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.460825 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.460995 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-registration-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.461170 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.461396 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.461748 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.462351 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-metrics-tls\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.462531 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebd00ffd-95e2-47bf-a6fd-663526b2283d-metrics-tls\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.462666 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-client\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463222 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnnxd\" (UniqueName: \"kubernetes.io/projected/22b9ac88-75ea-4572-bd27-f819caf4d8e2-kube-api-access-rnnxd\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463445 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-kube-api-access-72hqz\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463643 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9zp5\" (UniqueName: \"kubernetes.io/projected/dfea0242-abc1-4912-a193-6c4dc75d9bb5-kube-api-access-q9zp5\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463812 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-policies\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463983 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd00ffd-95e2-47bf-a6fd-663526b2283d-trusted-ca\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464149 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cp78\" (UniqueName: \"kubernetes.io/projected/55da5246-1df8-4666-ad7c-9407719b3abb-kube-api-access-9cp78\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464317 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bspth\" (UniqueName: 
\"kubernetes.io/projected/b227bf69-003e-4831-8ce3-a5b1f7f85c31-kube-api-access-bspth\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464504 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464655 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465015 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-serving-cert\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465178 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29gpp\" (UniqueName: \"kubernetes.io/projected/9c737bbb-9153-4689-bbd7-1925cd53b343-kube-api-access-29gpp\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465342 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptgzs\" (UniqueName: \"kubernetes.io/projected/5373e962-abd6-4153-9cc9-7d17b9ae5fe5-kube-api-access-ptgzs\") pod \"migrator-59844c95c7-wp86n\" (UID: \"5373e962-abd6-4153-9cc9-7d17b9ae5fe5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465574 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465727 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465954 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-oauth-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.466041 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:56.966022046 +0000 UTC m=+205.566120486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.466240 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31984625-3905-4d4d-9c52-e7d11c6c15d4-proxy-tls\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.466360 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.466448 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464692 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-policies\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.466566 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd00ffd-95e2-47bf-a6fd-663526b2283d-trusted-ca\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.467028 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldpp\" (UniqueName: 
\"kubernetes.io/projected/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-kube-api-access-6ldpp\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.467183 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p57k\" (UniqueName: \"kubernetes.io/projected/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-kube-api-access-7p57k\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.467424 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2zdq\" (UniqueName: \"kubernetes.io/projected/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-kube-api-access-k2zdq\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.467574 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-metrics-certs\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468355 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468255 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-metrics-tls\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468505 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-client\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468702 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468941 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " 
pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469108 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlk8h\" (UniqueName: \"kubernetes.io/projected/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-kube-api-access-hlk8h\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469261 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-auth-proxy-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469405 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-serving-cert\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469553 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469208 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469858 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz6gs\" (UniqueName: \"kubernetes.io/projected/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-kube-api-access-jz6gs\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469983 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebd00ffd-95e2-47bf-a6fd-663526b2283d-metrics-tls\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470002 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-images\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470338 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-dir\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470482 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470616 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-profile-collector-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470394 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-auth-proxy-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470892 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-dir\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470923 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-image-import-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471162 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f53f35-efe7-4b1c-9a25-d82b824c156f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471322 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471486 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471650 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtq5\" (UniqueName: \"kubernetes.io/projected/d76474c2-7d5c-45a0-8869-d829b0c594d6-kube-api-access-kqtq5\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471829 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzmt\" (UniqueName: \"kubernetes.io/projected/2688c073-5209-4258-a681-186370d9abcc-kube-api-access-bzzmt\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471861 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471996 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb9f9\" (UniqueName: \"kubernetes.io/projected/211c2269-7173-4fcb-9403-be48b10ab364-kube-api-access-zb9f9\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472225 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472302 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472465 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-service-ca\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472529 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-image-import-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: 
\"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472549 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b227bf69-003e-4831-8ce3-a5b1f7f85c31-serving-cert\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472612 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472663 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stng\" (UniqueName: \"kubernetes.io/projected/74d1b439-9506-4a1a-a1a4-3f5ca7944750-kube-api-access-8stng\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472705 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-node-bootstrap-token\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472746 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472824 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472852 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472877 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-plugins-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 
00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472895 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e46a85-c462-4ef3-a944-6ed47d2b0598-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472912 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6f53f35-efe7-4b1c-9a25-d82b824c156f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472928 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b227bf69-003e-4831-8ce3-a5b1f7f85c31-config\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472944 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/143ccd96-ced1-466f-8891-72abc221bbac-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472953 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-serving-cert\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472962 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143ccd96-ced1-466f-8891-72abc221bbac-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473045 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-srv-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473082 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74d1b439-9506-4a1a-a1a4-3f5ca7944750-proxy-tls\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473126 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqlkf\" (UniqueName: \"kubernetes.io/projected/aef72d9a-3e65-495f-8e73-ee539c10a29e-kube-api-access-pqlkf\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473184 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473261 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-service-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473292 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211c2269-7173-4fcb-9403-be48b10ab364-config-volume\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473350 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473381 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-csi-data-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.475150 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.475272 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.476003 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-service-ca\") pod \"console-f9d7485db-fp4l5\" (UID: 
\"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.477061 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.477182 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.477319 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.477501 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-oauth-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.478082 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.482397 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.494410 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.515362 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.534332 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.554998 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.574695 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575250 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575527 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.575670 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.075531205 +0000 UTC m=+205.675629665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575849 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptgzs\" (UniqueName: \"kubernetes.io/projected/5373e962-abd6-4153-9cc9-7d17b9ae5fe5-kube-api-access-ptgzs\") pod \"migrator-59844c95c7-wp86n\" (UID: \"5373e962-abd6-4153-9cc9-7d17b9ae5fe5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575883 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575910 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31984625-3905-4d4d-9c52-e7d11c6c15d4-proxy-tls\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575948 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p57k\" (UniqueName: \"kubernetes.io/projected/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-kube-api-access-7p57k\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575977 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k2zdq\" (UniqueName: \"kubernetes.io/projected/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-kube-api-access-k2zdq\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575999 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-metrics-certs\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.576018 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.576054 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlk8h\" (UniqueName: \"kubernetes.io/projected/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-kube-api-access-hlk8h\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.576512 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.076495453 +0000 UTC m=+205.676593893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577095 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-serving-cert\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577210 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577313 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz6gs\" (UniqueName: \"kubernetes.io/projected/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-kube-api-access-jz6gs\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577420 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-images\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577524 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577630 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-profile-collector-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577737 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f53f35-efe7-4b1c-9a25-d82b824c156f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 
crc kubenswrapper[4983]: I0316 00:09:56.577860 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577978 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb9f9\" (UniqueName: \"kubernetes.io/projected/211c2269-7173-4fcb-9403-be48b10ab364-kube-api-access-zb9f9\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578074 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b227bf69-003e-4831-8ce3-a5b1f7f85c31-serving-cert\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578191 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8stng\" (UniqueName: \"kubernetes.io/projected/74d1b439-9506-4a1a-a1a4-3f5ca7944750-kube-api-access-8stng\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578301 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-node-bootstrap-token\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578410 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578509 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-plugins-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578611 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e46a85-c462-4ef3-a944-6ed47d2b0598-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578915 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/b6f53f35-efe7-4b1c-9a25-d82b824c156f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579088 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b227bf69-003e-4831-8ce3-a5b1f7f85c31-config\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578516 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578942 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-plugins-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579228 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/143ccd96-ced1-466f-8891-72abc221bbac-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579412 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579428 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143ccd96-ced1-466f-8891-72abc221bbac-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579485 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-srv-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74d1b439-9506-4a1a-a1a4-3f5ca7944750-proxy-tls\") pod 
\"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579563 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqlkf\" (UniqueName: \"kubernetes.io/projected/aef72d9a-3e65-495f-8e73-ee539c10a29e-kube-api-access-pqlkf\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579595 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-service-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579665 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211c2269-7173-4fcb-9403-be48b10ab364-config-volume\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579790 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-csi-data-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579833 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/de84a408-0f98-48c6-83a5-e6976b576989-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579906 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-certs\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579925 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-metrics-certs\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579941 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-csi-data-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579998 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e46a85-c462-4ef3-a944-6ed47d2b0598-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580194 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580253 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-socket-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-key\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580320 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-tmpfs\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580351 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f53f35-efe7-4b1c-9a25-d82b824c156f-config\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580380 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-default-certificate\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580430 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef72d9a-3e65-495f-8e73-ee539c10a29e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580495 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-config\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580536 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-service-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580539 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jwxb\" (UniqueName: \"kubernetes.io/projected/34398886-1821-47c0-bbff-951177287627-kube-api-access-9jwxb\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580611 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssz5w\" (UniqueName: \"kubernetes.io/projected/31984625-3905-4d4d-9c52-e7d11c6c15d4-kube-api-access-ssz5w\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580653 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/211c2269-7173-4fcb-9403-be48b10ab364-metrics-tls\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580694 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580777 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qnw\" (UniqueName: \"kubernetes.io/projected/de84a408-0f98-48c6-83a5-e6976b576989-kube-api-access-q6qnw\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580816 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580873 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e46a85-c462-4ef3-a944-6ed47d2b0598-config\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580906 
4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94hh\" (UniqueName: \"kubernetes.io/projected/33ff8ce9-2d36-4251-ae9d-802d9965bfde-kube-api-access-w94hh\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580937 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580970 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-cabundle\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581003 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581058 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnmbs\" (UniqueName: \"kubernetes.io/projected/61000119-35ce-40ee-a8c5-5ad9052b539d-kube-api-access-fnmbs\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581089 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581310 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-tmpfs\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581974 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-socket-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582255 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e46a85-c462-4ef3-a944-6ed47d2b0598-config\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: 
\"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582259 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef72d9a-3e65-495f-8e73-ee539c10a29e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582366 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22b9ac88-75ea-4572-bd27-f819caf4d8e2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582383 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582405 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-etcd-client\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582434 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-config\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582452 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfea0242-abc1-4912-a193-6c4dc75d9bb5-service-ca-bundle\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582488 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ff8ce9-2d36-4251-ae9d-802d9965bfde-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582525 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582558 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-656xd\" (UniqueName: \"kubernetes.io/projected/9c413c46-e4ff-43f2-b66a-8a62e1f08890-kube-api-access-656xd\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582589 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnznm\" (UniqueName: \"kubernetes.io/projected/8820c8ae-e5d3-4c91-8724-ec666e783179-kube-api-access-tnznm\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582618 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef72d9a-3e65-495f-8e73-ee539c10a29e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582661 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") pod \"auto-csr-approver-29560328-sngnj\" (UID: \"9da42bf3-da76-4db7-9653-f2f08567084f\") " pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582703 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-webhook-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582735 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-stats-auth\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582810 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ccd96-ced1-466f-8891-72abc221bbac-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-mountpoint-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582887 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31984625-3905-4d4d-9c52-e7d11c6c15d4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6ppdt\" 
(UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582918 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-srv-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582944 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-registration-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583030 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583068 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnnxd\" (UniqueName: \"kubernetes.io/projected/22b9ac88-75ea-4572-bd27-f819caf4d8e2-kube-api-access-rnnxd\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583098 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-kube-api-access-72hqz\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583132 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9zp5\" (UniqueName: \"kubernetes.io/projected/dfea0242-abc1-4912-a193-6c4dc75d9bb5-kube-api-access-q9zp5\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583164 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cp78\" (UniqueName: \"kubernetes.io/projected/55da5246-1df8-4666-ad7c-9407719b3abb-kube-api-access-9cp78\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583186 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bspth\" (UniqueName: \"kubernetes.io/projected/b227bf69-003e-4831-8ce3-a5b1f7f85c31-kube-api-access-bspth\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583485 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfea0242-abc1-4912-a193-6c4dc75d9bb5-service-ca-bundle\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583504 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143ccd96-ced1-466f-8891-72abc221bbac-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583737 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-mountpoint-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.584230 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-registration-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.584356 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ccd96-ced1-466f-8891-72abc221bbac-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.584867 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31984625-3905-4d4d-9c52-e7d11c6c15d4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.585421 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.585468 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-serving-cert\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.585884 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e46a85-c462-4ef3-a944-6ed47d2b0598-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.586692 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ff8ce9-2d36-4251-ae9d-802d9965bfde-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.586743 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-default-certificate\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.587404 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-etcd-client\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.587694 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef72d9a-3e65-495f-8e73-ee539c10a29e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.588090 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.588626 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-stats-auth\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.594993 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.614974 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.621405 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f53f35-efe7-4b1c-9a25-d82b824c156f-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.638594 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.653414 4983 request.go:700] Waited for 1.009011976s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/secrets?fieldSelector=metadata.name%3Dkube-controller-manager-operator-dockercfg-gkqpw&limit=500&resourceVersion=0 Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.655157 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.674538 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.682309 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f53f35-efe7-4b1c-9a25-d82b824c156f-config\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.684297 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.684596 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.184561669 +0000 UTC m=+205.784660169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.685962 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.686462 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.186442785 +0000 UTC m=+205.786541255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.695021 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.702327 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31984625-3905-4d4d-9c52-e7d11c6c15d4-proxy-tls\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.715814 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.735547 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.747440 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22b9ac88-75ea-4572-bd27-f819caf4d8e2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.754947 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.775874 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.787395 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.787531 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.287503721 +0000 UTC m=+205.887602191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.787813 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.788144 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.28812907 +0000 UTC m=+205.888227500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.796313 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.815608 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.825574 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/de84a408-0f98-48c6-83a5-e6976b576989-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.834727 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.855882 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.867259 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-srv-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.876292 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.881121 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.882095 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-profile-collector-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.885672 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.889739 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.890015 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.389991271 +0000 UTC m=+205.990089701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.890253 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.890610 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.390601419 +0000 UTC m=+205.990699849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.894452 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.898725 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-images\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.915421 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.922470 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74d1b439-9506-4a1a-a1a4-3f5ca7944750-proxy-tls\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.934238 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.955618 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.975009 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.993189 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.994228 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.494213251 +0000 UTC m=+206.094311681 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.995575 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.014944 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.024933 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-key\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.035566 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.043499 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-cabundle\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.056329 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.075717 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.096249 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.096932 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.596909837 +0000 UTC m=+206.197008307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.114732 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.134857 4983 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.154969 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.175404 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.185281 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-srv-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.194964 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.197530 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.197733 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.697704445 +0000 UTC m=+206.297802885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.198417 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.198921 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.698910391 +0000 UTC m=+206.299008831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.203585 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-node-bootstrap-token\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.215005 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.224946 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-certs\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.235307 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.254622 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.266598 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:57 crc 
kubenswrapper[4983]: I0316 00:09:57.268407 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-webhook-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.274413 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.281183 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b227bf69-003e-4831-8ce3-a5b1f7f85c31-serving-cert\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.294466 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.299567 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.300366 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.800289897 +0000 UTC m=+206.400388367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.300620 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.301245 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.801227535 +0000 UTC m=+206.401325975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.314401 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.333954 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.341046 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b227bf69-003e-4831-8ce3-a5b1f7f85c31-config\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.355001 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.374968 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.394231 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.397113 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.401845 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.401975 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.901956851 +0000 UTC m=+206.502055281 (durationBeforeRetry 500ms). 
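Each failed attempt is rescheduled by nestedpendingoperations.go with "No retries permitted until ... (durationBeforeRetry 500ms)": the volume manager backs off exponentially per operation rather than retrying in a tight loop. A minimal sketch of that schedule follows, assuming the upstream exponential-backoff constants of a 500ms initial delay doubling up to a 2m2s cap; treat both numbers as assumptions about this particular build rather than values read out of the log:

```go
// backoff.go: shape of the retry schedule behind "durationBeforeRetry".
// The 500ms initial delay and 2m2s cap are assumed upstream constants,
// not read out of this log.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initialDelay = 500 * time.Millisecond
		maxDelay     = 2*time.Minute + 2*time.Second
	)
	delay := initialDelay
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %d: wait %v before retrying\n", attempt, delay)
		delay *= 2 // double after every failure...
		if delay > maxDelay {
			delay = maxDelay // ...but never wait longer than the cap
		}
	}
}
```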
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.402622 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.402979 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.902971352 +0000 UTC m=+206.503069782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.421720 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.445545 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.450600 4983 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.450724 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert podName:8ac1b1cc-8499-493f-a8d9-801eb433163f nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.950695256 +0000 UTC m=+206.550793726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert") pod "route-controller-manager-6576b87f9c-rvjb2" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.451747 4983 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.451922 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert podName:249f0516-0237-4ba3-92eb-a7aa3b9c62c1 nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:57.951885552 +0000 UTC m=+206.551984032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert") pod "apiserver-76f77b778f-lc9bv" (UID: "249f0516-0237-4ba3-92eb-a7aa3b9c62c1") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.455054 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.455921 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.458614 4983 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.458701 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config podName:249f0516-0237-4ba3-92eb-a7aa3b9c62c1 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.958682104 +0000 UTC m=+206.558780554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config") pod "apiserver-76f77b778f-lc9bv" (UID: "249f0516-0237-4ba3-92eb-a7aa3b9c62c1") : failed to sync configmap cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.463498 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.473381 4983 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.473501 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client podName:249f0516-0237-4ba3-92eb-a7aa3b9c62c1 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.973467936 +0000 UTC m=+206.573566576 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client") pod "apiserver-76f77b778f-lc9bv" (UID: "249f0516-0237-4ba3-92eb-a7aa3b9c62c1") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.475150 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.495864 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.505623 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.505865 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.005832542 +0000 UTC m=+206.605931012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.506243 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.506721 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.006704608 +0000 UTC m=+206.606803078 (durationBeforeRetry 500ms). 
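The MountVolume.SetUp failures for "serving-cert", "config", and "etcd-client" above are a second, usually transient, failure mode: secret and configmap volumes are resolved through the kubelet's watch-based object cache, and early in startup the mount attempt can time out before the watch has delivered the object; the interleaved "Caches populated for *v1.Secret ..." lines show the caches catching up moments later. The underlying wait-then-read pattern looks roughly like the client-go sketch below; the 10s timeout is an illustrative assumption, while the namespace and secret name are taken from the surrounding log:

```go
// secretcache.go: the wait-then-read pattern behind "failed to sync
// secret cache: timed out waiting for the condition". The 10s timeout is
// illustrative; namespace and secret name appear in the log above.
package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	factory := informers.NewSharedInformerFactoryWithOptions(
		cs, 0, informers.WithNamespace("openshift-service-ca"))
	secrets := factory.Core().V1().Secrets()

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()
	factory.Start(ctx.Done())

	// Reading before the watch cache has synced is what yields the
	// "timed out waiting for the condition" errors seen above.
	if !cache.WaitForCacheSync(ctx.Done(), secrets.Informer().HasSynced) {
		panic("timed out waiting for secret cache to sync")
	}
	s, err := secrets.Lister().Secrets("openshift-service-ca").Get("signing-key")
	if err != nil {
		panic(err)
	}
	fmt.Printf("cache synced; resolved secret %s/%s\n", s.Namespace, s.Name)
}
```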
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.514556 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.523492 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211c2269-7173-4fcb-9403-be48b10ab364-config-volume\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.537325 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.554994 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.565501 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/211c2269-7173-4fcb-9403-be48b10ab364-metrics-tls\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.575289 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.575725 4983 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.575922 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert podName:9c413c46-e4ff-43f2-b66a-8a62e1f08890 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.075898133 +0000 UTC m=+206.675996593 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert") pod "ingress-canary-5zxcb" (UID: "9c413c46-e4ff-43f2-b66a-8a62e1f08890") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.595287 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.607276 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.607552 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.107503216 +0000 UTC m=+206.707601656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.608516 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.609728 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.10963691 +0000 UTC m=+206.709735490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.616305 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.637488 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.673146 4983 request.go:700] Waited for 1.931990934s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.683913 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.696432 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n25x\" (UniqueName: \"kubernetes.io/projected/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-kube-api-access-6n25x\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.710342 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.710575 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.210541222 +0000 UTC m=+206.810639672 (durationBeforeRetry 500ms). 
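The "Waited for 1.931990934s due to client-side throttling, not priority and fairness" line just above is client-go's own token-bucket rate limiter delaying a request (here a service-account token POST) before it ever reaches the API server's priority-and-fairness queues. Per client it is governed by the QPS and Burst fields of rest.Config; a minimal sketch with illustrative values, not the kubelet's actual settings:

```go
// throttle.go: client-side throttling is the token-bucket limiter on
// rest.Config, applied before a request leaves the client. QPS and Burst
// below are illustrative values.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cfg.QPS = 5    // sustained request rate once the Burst bucket is drained
	cfg.Burst = 10 // bucket size for short spikes
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// A burst of calls larger than Burst is delayed to the QPS rate here,
	// and client-go logs "Waited for ... due to client-side throttling"
	// whenever that wait is long enough to be worth reporting.
	nsList, err := cs.CoreV1().Namespaces().List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("listed %d namespaces\n", len(nsList.Items))
}
```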
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.710787 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.711179 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.21115631 +0000 UTC m=+206.811254780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.715199 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.742146 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dwvz\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-kube-api-access-9dwvz\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.748616 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.754822 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.758902 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44hs\" (UniqueName: \"kubernetes.io/projected/ddcf712a-d77b-446c-b9e8-7083ff491d3c-kube-api-access-l44hs\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.761821 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.799563 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.804603 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.812630 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5kgk\" (UniqueName: \"kubernetes.io/projected/211771ed-66f1-4866-b193-5da61bbd38b4-kube-api-access-l5kgk\") pod \"downloads-7954f5f757-6j9qt\" (UID: \"211771ed-66f1-4866-b193-5da61bbd38b4\") " pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.812793 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.812965 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.312940638 +0000 UTC m=+206.913039098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.813459 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.813849 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.313832405 +0000 UTC m=+206.913930845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.833063 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfpdm\" (UniqueName: \"kubernetes.io/projected/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-kube-api-access-jfpdm\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.866464 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.876707 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.878936 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.882125 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.929920 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.936100 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.436074623 +0000 UTC m=+207.036173063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.936623 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.938552 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs6fr\" (UniqueName: \"kubernetes.io/projected/bcce228b-5abb-4cbb-8f79-57326a3a9665-kube-api-access-fs6fr\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.952850 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8lpz\" (UniqueName: \"kubernetes.io/projected/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-kube-api-access-g8lpz\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.953548 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwpkm\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-kube-api-access-gwpkm\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.970205 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.995371 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29gpp\" (UniqueName: \"kubernetes.io/projected/9c737bbb-9153-4689-bbd7-1925cd53b343-kube-api-access-29gpp\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.018794 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldpp\" (UniqueName: \"kubernetes.io/projected/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-kube-api-access-6ldpp\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.036738 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.036882 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.036954 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.037050 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.037136 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.037511 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.53749283 +0000 UTC m=+207.137591260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.039455 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.054303 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtq5\" (UniqueName: \"kubernetes.io/projected/d76474c2-7d5c-45a0-8869-d829b0c594d6-kube-api-access-kqtq5\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.078214 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzmt\" (UniqueName: \"kubernetes.io/projected/2688c073-5209-4258-a681-186370d9abcc-kube-api-access-bzzmt\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.082052 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.086029 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.094092 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.095655 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.095839 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.099708 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.112213 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.119883 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.123427 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.126379 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.131973 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.133293 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptgzs\" (UniqueName: \"kubernetes.io/projected/5373e962-abd6-4153-9cc9-7d17b9ae5fe5-kube-api-access-ptgzs\") pod \"migrator-59844c95c7-wp86n\" (UID: \"5373e962-abd6-4153-9cc9-7d17b9ae5fe5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.138446 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.138640 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.142947 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.642916847 +0000 UTC m=+207.243015277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.145981 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.150690 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p57k\" (UniqueName: \"kubernetes.io/projected/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-kube-api-access-7p57k\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.159727 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.187179 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lx4mf"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.191099 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.205857 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlk8h\" (UniqueName: \"kubernetes.io/projected/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-kube-api-access-hlk8h\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.217316 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz6gs\" (UniqueName: \"kubernetes.io/projected/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-kube-api-access-jz6gs\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.224436 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6j9qt"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.229745 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb9f9\" (UniqueName: \"kubernetes.io/projected/211c2269-7173-4fcb-9403-be48b10ab364-kube-api-access-zb9f9\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.239603 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.239890 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.739879781 +0000 UTC m=+207.339978211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.252340 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.257650 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stng\" (UniqueName: \"kubernetes.io/projected/74d1b439-9506-4a1a-a1a4-3f5ca7944750-kube-api-access-8stng\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.267888 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211771ed_66f1_4866_b193_5da61bbd38b4.slice/crio-4fbf47dbb17c4a5210d7461c6dede8a14a1f876ce7c9d7841cd12a9740c1ad64 WatchSource:0}: Error finding container 4fbf47dbb17c4a5210d7461c6dede8a14a1f876ce7c9d7841cd12a9740c1ad64: Status 404 returned error can't find the container with id 4fbf47dbb17c4a5210d7461c6dede8a14a1f876ce7c9d7841cd12a9740c1ad64 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.274873 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6f53f35-efe7-4b1c-9a25-d82b824c156f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.275140 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.284903 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.288336 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.292477 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqlkf\" (UniqueName: \"kubernetes.io/projected/aef72d9a-3e65-495f-8e73-ee539c10a29e-kube-api-access-pqlkf\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.313240 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e46a85-c462-4ef3-a944-6ed47d2b0598-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.330862 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.333195 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jwxb\" (UniqueName: \"kubernetes.io/projected/34398886-1821-47c0-bbff-951177287627-kube-api-access-9jwxb\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.340487 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.341095 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.84108119 +0000 UTC m=+207.441179610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.341116 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.342013 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.349480 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssz5w\" (UniqueName: \"kubernetes.io/projected/31984625-3905-4d4d-9c52-e7d11c6c15d4-kube-api-access-ssz5w\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.358600 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.371747 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94hh\" (UniqueName: \"kubernetes.io/projected/33ff8ce9-2d36-4251-ae9d-802d9965bfde-kube-api-access-w94hh\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.377266 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fp4l5"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.392398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qnw\" (UniqueName: \"kubernetes.io/projected/de84a408-0f98-48c6-83a5-e6976b576989-kube-api-access-q6qnw\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.393267 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a768f3_aa53_481d_b179_5c8807f69e89.slice/crio-b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb WatchSource:0}: Error finding container b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb: Status 404 returned error can't find the container with id b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.397383 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd76474c2_7d5c_45a0_8869_d829b0c594d6.slice/crio-b3f67d7e32c58857716cb5db5b5ff53b1420e134f3f9b090f505ee73349d30f5 WatchSource:0}: Error finding container b3f67d7e32c58857716cb5db5b5ff53b1420e134f3f9b090f505ee73349d30f5: Status 404 returned error can't find the container with id b3f67d7e32c58857716cb5db5b5ff53b1420e134f3f9b090f505ee73349d30f5 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.399389 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.418026 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.429236 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.433345 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.438291 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.444873 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.445222 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.945210318 +0000 UTC m=+207.545308748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.451299 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnmbs\" (UniqueName: \"kubernetes.io/projected/61000119-35ce-40ee-a8c5-5ad9052b539d-kube-api-access-fnmbs\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.471987 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnznm\" (UniqueName: \"kubernetes.io/projected/8820c8ae-e5d3-4c91-8724-ec666e783179-kube-api-access-tnznm\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.472247 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.485819 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.489230 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bspth\" (UniqueName: \"kubernetes.io/projected/b227bf69-003e-4831-8ce3-a5b1f7f85c31-kube-api-access-bspth\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.491270 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.521033 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29560320-9tclx"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.524990 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") pod \"auto-csr-approver-29560328-sngnj\" (UID: \"9da42bf3-da76-4db7-9653-f2f08567084f\") " pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.530489 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.536895 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-656xd\" (UniqueName: \"kubernetes.io/projected/9c413c46-e4ff-43f2-b66a-8a62e1f08890-kube-api-access-656xd\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.546383 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.546580 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.046554563 +0000 UTC m=+207.646653003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.546841 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.547216 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.047205003 +0000 UTC m=+207.647303483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.549953 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnnxd\" (UniqueName: \"kubernetes.io/projected/22b9ac88-75ea-4572-bd27-f819caf4d8e2-kube-api-access-rnnxd\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.563014 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.568099 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.570338 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9zp5\" (UniqueName: \"kubernetes.io/projected/dfea0242-abc1-4912-a193-6c4dc75d9bb5-kube-api-access-q9zp5\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.596697 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.598348 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-kube-api-access-72hqz\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.613400 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.616094 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.617742 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cp78\" (UniqueName: \"kubernetes.io/projected/55da5246-1df8-4666-ad7c-9407719b3abb-kube-api-access-9cp78\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.627563 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.629997 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9k8tn"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.631244 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/143ccd96-ced1-466f-8891-72abc221bbac-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.638417 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.645742 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.646173 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.651567 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.651982 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:59.151966539 +0000 UTC m=+207.752064969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.654530 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.654581 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.656698 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.662062 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.662274 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.665889 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.673844 4983 request.go:700] Waited for 1.840483423s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.675307 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.679074 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zczdg\" (UniqueName: \"kubernetes.io/projected/6b0e4e23-a158-4597-b005-db088a652ec8-kube-api-access-zczdg\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.684255 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2zdq\" (UniqueName: \"kubernetes.io/projected/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-kube-api-access-k2zdq\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.684299 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.691798 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.696152 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.700397 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.707275 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.707786 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5bd50b_b197_4deb_ac50_768e3baa6cff.slice/crio-de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce WatchSource:0}: Error finding container de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce: Status 404 returned error can't find the container with id de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.715203 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.716229 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.721000 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t65x6"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.763093 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.763996 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.766271 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.767604 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.26759008 +0000 UTC m=+207.867688510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.784426 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.800599 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njztx"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.803297 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.812653 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b3cc32d_4d8c_47ee_bf9c_2319482ab78f.slice/crio-213fa260d6d2ab1c7363915f32ad9c9d84d0c738700cc118181e99da7f885c08 WatchSource:0}: Error finding container 213fa260d6d2ab1c7363915f32ad9c9d84d0c738700cc118181e99da7f885c08: Status 404 returned error can't find the container with id 213fa260d6d2ab1c7363915f32ad9c9d84d0c738700cc118181e99da7f885c08 Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.817232 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcce228b_5abb_4cbb_8f79_57326a3a9665.slice/crio-3ec71c3d2428b1483823a956e0140519b9b4d56c490cd7d07c693e2bdd3ce553 WatchSource:0}: Error finding container 3ec71c3d2428b1483823a956e0140519b9b4d56c490cd7d07c693e2bdd3ce553: Status 404 returned error can't find the container with id 3ec71c3d2428b1483823a956e0140519b9b4d56c490cd7d07c693e2bdd3ce553 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.818001 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.819188 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f53f35_efe7_4b1c_9a25_d82b824c156f.slice/crio-8731e4e180ee2daf698749d84e99c4ed8cdffe0f0fdfed703a8d018531ed2b36 WatchSource:0}: Error finding container 8731e4e180ee2daf698749d84e99c4ed8cdffe0f0fdfed703a8d018531ed2b36: Status 404 returned error can't find the container with id 8731e4e180ee2daf698749d84e99c4ed8cdffe0f0fdfed703a8d018531ed2b36 Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.825440 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e96ad5_bed1_4cf2_acf0_7f61294d16a7.slice/crio-a03594a8fe18f1d17dcd038ab4c4d231f097f012bb6b478888b19411d2d3e440 WatchSource:0}: Error finding container a03594a8fe18f1d17dcd038ab4c4d231f097f012bb6b478888b19411d2d3e440: Status 404 returned error can't find the container with id a03594a8fe18f1d17dcd038ab4c4d231f097f012bb6b478888b19411d2d3e440 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.837673 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.841709 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.844388 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dee21fa_f6c7_4ef6_a99d_21ad42acd3e1.slice/crio-d65927f74531b4ae7237f192b19f24d9b771cf72d783cc16be3b30d05db2b676 WatchSource:0}: Error finding container d65927f74531b4ae7237f192b19f24d9b771cf72d783cc16be3b30d05db2b676: Status 404 returned error can't find the container with id d65927f74531b4ae7237f192b19f24d9b771cf72d783cc16be3b30d05db2b676 Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.848411 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8662dd30_6a4c_4a3d_a3bb_8d24821241fa.slice/crio-267e1c669f5b6be78d2c5d428fe7e487b5235740a5be9f3a8c5313d3ed3e5b8a WatchSource:0}: Error finding container 267e1c669f5b6be78d2c5d428fe7e487b5235740a5be9f3a8c5313d3ed3e5b8a: Status 404 returned error can't find the container with id 267e1c669f5b6be78d2c5d428fe7e487b5235740a5be9f3a8c5313d3ed3e5b8a Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.851820 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.855220 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d1b439_9506_4a1a_a1a4_3f5ca7944750.slice/crio-7773e05a45c023dfbaee250133ba3c8509f1eb88ed482e7416f7b66f9be6f92b WatchSource:0}: Error finding container 7773e05a45c023dfbaee250133ba3c8509f1eb88ed482e7416f7b66f9be6f92b: Status 404 returned error can't find the container with id 7773e05a45c023dfbaee250133ba3c8509f1eb88ed482e7416f7b66f9be6f92b Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.867017 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2688c073_5209_4258_a681_186370d9abcc.slice/crio-0514edf5057782dc147fcbf25c2e071a12af4f2b549922f7163f1f3bc0edaa4e WatchSource:0}: Error finding container 0514edf5057782dc147fcbf25c2e071a12af4f2b549922f7163f1f3bc0edaa4e: Status 404 returned error can't find the container with id 0514edf5057782dc147fcbf25c2e071a12af4f2b549922f7163f1f3bc0edaa4e Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.868165 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.868509 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.368494782 +0000 UTC m=+207.968593212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.868582 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.973549 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.973891 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.473877018 +0000 UTC m=+208.073975448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.075200 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.075656 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.575636565 +0000 UTC m=+208.175735005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.081735 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.083983 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mjkh8"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.085339 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" event={"ID":"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7","Type":"ContainerStarted","Data":"b0a22b729384eceaec70338f2664d98b4fd349a61381f59e8d8000d93e6ea45c"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.085377 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" event={"ID":"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7","Type":"ContainerStarted","Data":"868596c93911bc2ce0d03c99cb54a64d4a75b2790c09c5e6365f25e02d16c389"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.088691 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" event={"ID":"8662dd30-6a4c-4a3d-a3bb-8d24821241fa","Type":"ContainerStarted","Data":"267e1c669f5b6be78d2c5d428fe7e487b5235740a5be9f3a8c5313d3ed3e5b8a"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.103941 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" event={"ID":"ddcf712a-d77b-446c-b9e8-7083ff491d3c","Type":"ContainerStarted","Data":"d2688ca4b6c5c707a80fb06943ee2c21b4f8c5dd00db6134d8d4f77d5c364e05"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.103973 4983 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" event={"ID":"ddcf712a-d77b-446c-b9e8-7083ff491d3c","Type":"ContainerStarted","Data":"d0961189be900e1c2dcda31be599fd30f44777d7f3a5703f8d03618f8973bb05"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.103989 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.117134 4983 patch_prober.go:28] interesting pod/console-operator-58897d9998-lx4mf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.117176 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" podUID="ddcf712a-d77b-446c-b9e8-7083ff491d3c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.122588 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" event={"ID":"bcce228b-5abb-4cbb-8f79-57326a3a9665","Type":"ContainerStarted","Data":"3ec71c3d2428b1483823a956e0140519b9b4d56c490cd7d07c693e2bdd3ce553"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.128138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" event={"ID":"ebd00ffd-95e2-47bf-a6fd-663526b2283d","Type":"ContainerStarted","Data":"5162cab02d86fc9683e9aa9db705e7180d844cda571df9dbd03bb411eb3f2b8c"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.129289 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" event={"ID":"9c737bbb-9153-4689-bbd7-1925cd53b343","Type":"ContainerStarted","Data":"3a36dd48ac206ec21f4409f01f12556f1ffcb5a523b2d58864e8c8059ab57fb6"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.130516 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" event={"ID":"0fd829d1-ad38-407e-a576-43aa5a6ca8f2","Type":"ContainerStarted","Data":"5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.130552 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" event={"ID":"0fd829d1-ad38-407e-a576-43aa5a6ca8f2","Type":"ContainerStarted","Data":"992aee5b0776d510c59718dbe65f51126e10a5ddde1021826a4cd33845179277"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.130921 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.131298 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" event={"ID":"2688c073-5209-4258-a681-186370d9abcc","Type":"ContainerStarted","Data":"0514edf5057782dc147fcbf25c2e071a12af4f2b549922f7163f1f3bc0edaa4e"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.132421 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" event={"ID":"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc","Type":"ContainerStarted","Data":"cebb292829ee5a803948ac9659b48660b8fe5edfe2747823cec43da8907f3802"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.132444 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" event={"ID":"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc","Type":"ContainerStarted","Data":"7492fef3c4c1f293e07fe42f7d3a7b16a15efd858458425883b9d912d63b20b8"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.134851 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" event={"ID":"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1","Type":"ContainerStarted","Data":"d65927f74531b4ae7237f192b19f24d9b771cf72d783cc16be3b30d05db2b676"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.135631 4983 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-df6gg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.135686 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.143948 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fp4l5" event={"ID":"d76474c2-7d5c-45a0-8869-d829b0c594d6","Type":"ContainerStarted","Data":"b3f67d7e32c58857716cb5db5b5ff53b1420e134f3f9b090f505ee73349d30f5"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.146072 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" event={"ID":"53e96ad5-bed1-4cf2-acf0-7f61294d16a7","Type":"ContainerStarted","Data":"a03594a8fe18f1d17dcd038ab4c4d231f097f012bb6b478888b19411d2d3e440"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.146663 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" event={"ID":"b6f53f35-efe7-4b1c-9a25-d82b824c156f","Type":"ContainerStarted","Data":"8731e4e180ee2daf698749d84e99c4ed8cdffe0f0fdfed703a8d018531ed2b36"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.150634 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" event={"ID":"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f","Type":"ContainerStarted","Data":"213fa260d6d2ab1c7363915f32ad9c9d84d0c738700cc118181e99da7f885c08"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.154108 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-9tclx" event={"ID":"9f5bd50b-b197-4deb-ac50-768e3baa6cff","Type":"ContainerStarted","Data":"de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.155346 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" event={"ID":"74d1b439-9506-4a1a-a1a4-3f5ca7944750","Type":"ContainerStarted","Data":"7773e05a45c023dfbaee250133ba3c8509f1eb88ed482e7416f7b66f9be6f92b"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.157380 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6j9qt" event={"ID":"211771ed-66f1-4866-b193-5da61bbd38b4","Type":"ContainerStarted","Data":"5c8492fec88f4b13618a9dbd6bd6da1904f6e125f5f134be0db02f38c23179ca"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.157398 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6j9qt" event={"ID":"211771ed-66f1-4866-b193-5da61bbd38b4","Type":"ContainerStarted","Data":"4fbf47dbb17c4a5210d7461c6dede8a14a1f876ce7c9d7841cd12a9740c1ad64"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.157986 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.159393 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" event={"ID":"288bbae2-d98f-4e70-8f83-314c8a7a038b","Type":"ContainerStarted","Data":"73989f1f0cc2f3779005a964d470c9e62f00da4ba85bed4bbf78f1448df2d8aa"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.159412 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" event={"ID":"288bbae2-d98f-4e70-8f83-314c8a7a038b","Type":"ContainerStarted","Data":"90b48e07a8682d9e2d89981f9b4eca33656e2c6cd4e2fbb06d5f3e86a1ff6df3"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.162027 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.162051 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.162211 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" event={"ID":"5373e962-abd6-4153-9cc9-7d17b9ae5fe5","Type":"ContainerStarted","Data":"e67fd58081418d2850b7c6ee984c0827591c97f3998cc0c096962e658294485a"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.164172 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" event={"ID":"54a768f3-aa53-481d-b179-5c8807f69e89","Type":"ContainerStarted","Data":"b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.164887 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.171540 4983 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9hbqr container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.171592 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.176422 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.177917 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.677904367 +0000 UTC m=+208.278002797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.242670 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.277348 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.278156 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.77783689 +0000 UTC m=+208.377935320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.278348 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.279228 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.779212711 +0000 UTC m=+208.379311231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.348542 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.381478 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.382021 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.882003169 +0000 UTC m=+208.482101599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.382045 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.420817 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.433499 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qzvb8"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.437297 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.482179 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" podStartSLOduration=160.482162868 podStartE2EDuration="2m40.482162868s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:59.447891636 +0000 UTC m=+208.047990056" watchObservedRunningTime="2026-03-16 00:09:59.482162868 +0000 UTC m=+208.082261298" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.482605 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.482874 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.982862799 +0000 UTC m=+208.582961229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.537366 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" podStartSLOduration=161.537348546 podStartE2EDuration="2m41.537348546s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:59.535796989 +0000 UTC m=+208.135895429" watchObservedRunningTime="2026-03-16 00:09:59.537348546 +0000 UTC m=+208.137446976" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.588312 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.588616 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.088601825 +0000 UTC m=+208.688700245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: W0316 00:09:59.627056 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e46a85_c462_4ef3_a944_6ed47d2b0598.slice/crio-a857e30f706d81c0ba7ca15373135d1e1ab1827dd2b9bc97b0f5352aef4c79ae WatchSource:0}: Error finding container a857e30f706d81c0ba7ca15373135d1e1ab1827dd2b9bc97b0f5352aef4c79ae: Status 404 returned error can't find the container with id a857e30f706d81c0ba7ca15373135d1e1ab1827dd2b9bc97b0f5352aef4c79ae Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.645126 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" podStartSLOduration=160.645104122 podStartE2EDuration="2m40.645104122s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:59.607956303 +0000 UTC m=+208.208054743" watchObservedRunningTime="2026-03-16 00:09:59.645104122 +0000 UTC m=+208.245202552" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.689404 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.689678 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.189667982 +0000 UTC m=+208.789766412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.730911 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podStartSLOduration=160.730892632 podStartE2EDuration="2m40.730892632s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:59.725265184 +0000 UTC m=+208.325363614" watchObservedRunningTime="2026-03-16 00:09:59.730892632 +0000 UTC m=+208.330991062" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.790910 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.791245 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.291219463 +0000 UTC m=+208.891317893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.843068 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.843118 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49"] Mar 16 00:09:59 crc kubenswrapper[4983]: W0316 00:09:59.890549 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod143ccd96_ced1_466f_8891_72abc221bbac.slice/crio-dbdeebfa567437983439c92b15646caa142fa64eb93fccd9e33b193485d6a2a8 WatchSource:0}: Error finding container dbdeebfa567437983439c92b15646caa142fa64eb93fccd9e33b193485d6a2a8: Status 404 returned error can't find the container with id dbdeebfa567437983439c92b15646caa142fa64eb93fccd9e33b193485d6a2a8 Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.892219 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.892493 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.392483195 +0000 UTC m=+208.992581625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.993323 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.996640 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.496591123 +0000 UTC m=+209.096689553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.045692 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6j9qt" podStartSLOduration=161.045673888 podStartE2EDuration="2m41.045673888s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.044019148 +0000 UTC m=+208.644117578" watchObservedRunningTime="2026-03-16 00:10:00.045673888 +0000 UTC m=+208.645772318" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.100715 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.101362 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.601350209 +0000 UTC m=+209.201448639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.141136 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.141188 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc9bv"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.141203 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5zxcb"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.164203 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t4lj8"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.181771 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.194956 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.194982 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" event={"ID":"53e96ad5-bed1-4cf2-acf0-7f61294d16a7","Type":"ContainerStarted","Data":"353e6260b7eaf4cf83f83e7aff56c6a55ad8ab23277aed21c0f04df9c5a57ad2"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.195100 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" event={"ID":"b227bf69-003e-4831-8ce3-a5b1f7f85c31","Type":"ContainerStarted","Data":"8597cc81eadbdec63ceeb97b6aee7d39085f76d02923b96d9f85bf9188c36341"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.195115 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" event={"ID":"143ccd96-ced1-466f-8891-72abc221bbac","Type":"ContainerStarted","Data":"dbdeebfa567437983439c92b15646caa142fa64eb93fccd9e33b193485d6a2a8"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.195182 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-65dr5"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.196056 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.196635 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" event={"ID":"5373e962-abd6-4153-9cc9-7d17b9ae5fe5","Type":"ContainerStarted","Data":"2db08a5909d98bd7ece8fd48bbbc264af1806dddff0a85c34295e46ad5d3bce6"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.202546 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.202681 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.702655413 +0000 UTC m=+209.302753843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.203043 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.203299 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.703290232 +0000 UTC m=+209.303388662 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
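Each failed attempt is parked by kubelet's pending-operations bookkeeping, which refuses retries until a backoff deadline passes; every "No retries permitted until" line here shows the initial 500 ms delay. A rough sketch of that gate follows (constants and shapes are illustrative; kubelet grows the delay for an operation that keeps failing, up to a cap):

package main

import (
	"fmt"
	"time"
)

const (
	initialDelay = 500 * time.Millisecond // matches "durationBeforeRetry 500ms"
	maxDelay     = 2 * time.Minute        // illustrative cap
)

// backoff gates retries of one volume operation key.
type backoff struct {
	lastError time.Time
	duration  time.Duration
}

// fail records a failed attempt and grows the delay before the next try.
func (b *backoff) fail(now time.Time) {
	if b.duration == 0 {
		b.duration = initialDelay
	} else if b.duration *= 2; b.duration > maxDelay {
		b.duration = maxDelay
	}
	b.lastError = now
}

// allowed reports whether a new attempt may start at time now.
func (b *backoff) allowed(now time.Time) bool {
	return now.After(b.lastError.Add(b.duration))
}

func main() {
	var b backoff
	now := time.Now()
	b.fail(now)
	fmt.Printf("No retries permitted until %s (durationBeforeRetry %v)\n",
		now.Add(b.duration).Format("2006-01-02 15:04:05.000000000 -0700 MST"), b.duration)
	fmt.Println("retry allowed immediately?", b.allowed(now))
	fmt.Println("retry allowed after 600ms? ", b.allowed(now.Add(600*time.Millisecond)))
}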
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.204628 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.205567 4983 generic.go:334] "Generic (PLEG): container finished" podID="d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7" containerID="b0a22b729384eceaec70338f2664d98b4fd349a61381f59e8d8000d93e6ea45c" exitCode=0
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.205659 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" event={"ID":"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7","Type":"ContainerDied","Data":"b0a22b729384eceaec70338f2664d98b4fd349a61381f59e8d8000d93e6ea45c"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.207950 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" event={"ID":"31984625-3905-4d4d-9c52-e7d11c6c15d4","Type":"ContainerStarted","Data":"f4d420d9b34a817e92b62516fb78a06020b7ffa150dd9aef10049cb514baa18d"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.210694 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ml6pw"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.212878 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-82r5r" event={"ID":"55da5246-1df8-4666-ad7c-9407719b3abb","Type":"ContainerStarted","Data":"3bd5255589a4c515205e67bdf4f87eb0afab47190517d237d03c01ce52969d12"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.214202 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mjkh8" event={"ID":"211c2269-7173-4fcb-9403-be48b10ab364","Type":"ContainerStarted","Data":"6a33289fd77521cd37166fdf24cf77edbd7c3c31b54c359730b472bee23e39df"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.226279 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" event={"ID":"34398886-1821-47c0-bbff-951177287627","Type":"ContainerStarted","Data":"145e9218333e55acb9840b6fb949df13106ef40679a8fa644b1bc3725c1f8433"}
Mar 16 00:10:00 crc kubenswrapper[4983]: W0316 00:10:00.228604 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod249f0516_0237_4ba3_92eb_a7aa3b9c62c1.slice/crio-568845ee7ac2c267cff9d54987aa3c87ae9f7d3363c12766495c05454443fca2 WatchSource:0}: Error finding container 568845ee7ac2c267cff9d54987aa3c87ae9f7d3363c12766495c05454443fca2: Status 404 returned error can't find the container with id 568845ee7ac2c267cff9d54987aa3c87ae9f7d3363c12766495c05454443fca2
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.234496 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.235691 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w8qpq" event={"ID":"dfea0242-abc1-4912-a193-6c4dc75d9bb5","Type":"ContainerStarted","Data":"24d272d04c0560549b6742b323e69649413ed948f5548e408b9b1c261d2d388e"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.238503 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fp4l5" event={"ID":"d76474c2-7d5c-45a0-8869-d829b0c594d6","Type":"ContainerStarted","Data":"f1aed9c65d30ac039cadab304947e98f58048ca416d66ebdb897b989f91be90d"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.245979 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n22z7"]
Mar 16 00:10:00 crc kubenswrapper[4983]: W0316 00:10:00.248460 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac1b1cc_8499_493f_a8d9_801eb433163f.slice/crio-8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0 WatchSource:0}: Error finding container 8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0: Status 404 returned error can't find the container with id 8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.248720 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" event={"ID":"6b0e4e23-a158-4597-b005-db088a652ec8","Type":"ContainerStarted","Data":"ed21a48833c174e4fcc13e352d4b1b175887dbf647f6297028b62e850ef69e92"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.253662 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-9tclx" event={"ID":"9f5bd50b-b197-4deb-ac50-768e3baa6cff","Type":"ContainerStarted","Data":"b9e245e332a00fe31e8a513f16d938a911b68f20bd84b7aa4a069280729c1f31"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.256098 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" event={"ID":"74d1b439-9506-4a1a-a1a4-3f5ca7944750","Type":"ContainerStarted","Data":"97b77baf4f8726300cf864bec54760df49ff9c95a4afffdfa6ee99e92da6566d"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.256782 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" event={"ID":"87a722ee-1078-41fd-bd5e-96981b43652d","Type":"ContainerStarted","Data":"1965cf54da33760615e034ca9db488c5481e59caf0aa16831ccaefaf972dbc39"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.259602 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" event={"ID":"94e46a85-c462-4ef3-a944-6ed47d2b0598","Type":"ContainerStarted","Data":"a857e30f706d81c0ba7ca15373135d1e1ab1827dd2b9bc97b0f5352aef4c79ae"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.269816 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"]
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.272156 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" event={"ID":"aef72d9a-3e65-495f-8e73-ee539c10a29e","Type":"ContainerStarted","Data":"b5315a03863713f843c9b944f4b1c0c565229cc0e308c490f17a5db9d59c1391"}
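The "SyncLoop (PLEG): event for pod" lines come from the Pod Lifecycle Event Generator: each relist of the container runtime is diffed against the previous snapshot, and state transitions become ContainerStarted or ContainerDied events for the sync loop (the "Generic (PLEG): container finished" line above is the same mechanism observing an exit). A simplified sketch of that diff, with made-up pod name and container IDs and a much-reduced state model:

package main

import "fmt"

type containerState string

const (
	stateRunning containerState = "running"
	stateExited  containerState = "exited"
)

type podEvent struct {
	pod, containerID, kind string
}

// diffSnapshots compares two relist snapshots keyed by container ID and
// emits lifecycle events for the transitions it finds.
func diffSnapshots(pod string, prev, curr map[string]containerState) []podEvent {
	var events []podEvent
	for id, state := range curr {
		old, seen := prev[id]
		switch {
		case state == stateRunning && (!seen || old != stateRunning):
			events = append(events, podEvent{pod, id, "ContainerStarted"})
		case state == stateExited && seen && old == stateRunning:
			events = append(events, podEvent{pod, id, "ContainerDied"})
		}
	}
	return events
}

func main() {
	prev := map[string]containerState{"aaa111": stateRunning}
	curr := map[string]containerState{"aaa111": stateExited, "bbb222": stateRunning}
	for _, e := range diffSnapshots("openshift-example/some-pod", prev, curr) {
		fmt.Printf("SyncLoop (PLEG): event for pod %s: %s %s\n", e.pod, e.kind, e.containerID)
	}
}

The cAdvisor "Failed to process watch event ... Status 404" warnings interleaved here are a related timing artifact: a cgroup appears in the watch stream before the runtime can answer queries about the new container, so the first lookup by ID fails and is retried on the next relist.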
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.273053 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.285660 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" event={"ID":"b6f53f35-efe7-4b1c-9a25-d82b824c156f","Type":"ContainerStarted","Data":"493e177d5c682b97ceed9fadde908b79b08dde1bb0fbaa3eb23b0c4b5f72a635"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.293389 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" event={"ID":"ebd00ffd-95e2-47bf-a6fd-663526b2283d","Type":"ContainerStarted","Data":"e8c35420a3bab1fb4cd8c47471ef668f1a6989227fd8ee85480c5e541dd8e2ec"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.299473 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" event={"ID":"8662dd30-6a4c-4a3d-a3bb-8d24821241fa","Type":"ContainerStarted","Data":"5333ca9cc1a5370bfe2d231afed36f38b2926a11831f08aa215141d60e61c169"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.301121 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.303534 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.303749 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") pod \"auto-csr-approver-29560330-65dr5\" (UID: \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\") " pod="openshift-infra/auto-csr-approver-29560330-65dr5" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.303911 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.803847363 +0000 UTC m=+209.403945803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.304020 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.305676 4983 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-mw6rk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.305714 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" podUID="8662dd30-6a4c-4a3d-a3bb-8d24821241fa" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.306138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" event={"ID":"bcce228b-5abb-4cbb-8f79-57326a3a9665","Type":"ContainerStarted","Data":"2db792fdc99be701472444abc112040e2e138288e6176d6d440b992e7c9d893a"} Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.306322 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.806291366 +0000 UTC m=+209.406389836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.313470 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" event={"ID":"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1","Type":"ContainerStarted","Data":"ba869e3e342ba09bef033d10bd950525d19fc3c166e1837dfa0dd87b1cca26b3"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.323502 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.332468 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" event={"ID":"54a768f3-aa53-481d-b179-5c8807f69e89","Type":"ContainerStarted","Data":"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.336898 4983 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9hbqr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.337137 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.338865 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" event={"ID":"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d","Type":"ContainerStarted","Data":"12cab49d37456bf346f64873c96c0b6f78f6ed32a45865a4e4e1d7e1b4d68a36"}
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.339710 4983 patch_prober.go:28] interesting pod/console-operator-58897d9998-lx4mf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.339765 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" podUID="ddcf712a-d77b-446c-b9e8-7083ff491d3c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.341221 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.341266 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.347990 4983 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-df6gg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body=
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.348033 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.405333 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.405724 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.905697063 +0000 UTC m=+209.505795493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.405787 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") pod \"auto-csr-approver-29560330-65dr5\" (UID: \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\") " pod="openshift-infra/auto-csr-approver-29560330-65dr5"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.450037 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") pod \"auto-csr-approver-29560330-65dr5\" (UID: \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\") " pod="openshift-infra/auto-csr-approver-29560330-65dr5"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.507654 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.509641 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.009607125 +0000 UTC m=+209.609705545 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.515613 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" podStartSLOduration=162.515590743 podStartE2EDuration="2m42.515590743s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.465963532 +0000 UTC m=+209.066061962" watchObservedRunningTime="2026-03-16 00:10:00.515590743 +0000 UTC m=+209.115689183"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.553200 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-65dr5"
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.609526 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.609654 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.10962902 +0000 UTC m=+209.709727450 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.609913 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.610206 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.110199327 +0000 UTC m=+209.710297757 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.663063 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" podStartSLOduration=161.663047134 podStartE2EDuration="2m41.663047134s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.661452107 +0000 UTC m=+209.261550537" watchObservedRunningTime="2026-03-16 00:10:00.663047134 +0000 UTC m=+209.263145564" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.710735 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.710892 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.210864862 +0000 UTC m=+209.810963292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.711042 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.711340 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.211327845 +0000 UTC m=+209.811426275 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.747018 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" podStartSLOduration=162.74699948 podStartE2EDuration="2m42.74699948s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.746523666 +0000 UTC m=+209.346622096" watchObservedRunningTime="2026-03-16 00:10:00.74699948 +0000 UTC m=+209.347097910" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.812372 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.812971 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.312956549 +0000 UTC m=+209.913054979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.830105 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29560320-9tclx" podStartSLOduration=162.83008508 podStartE2EDuration="2m42.83008508s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.828970367 +0000 UTC m=+209.429068797" watchObservedRunningTime="2026-03-16 00:10:00.83008508 +0000 UTC m=+209.430183510" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.869500 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-fp4l5" podStartSLOduration=161.869485716 podStartE2EDuration="2m41.869485716s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.865333742 +0000 UTC m=+209.465432172" watchObservedRunningTime="2026-03-16 00:10:00.869485716 +0000 UTC m=+209.469584146" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.915107 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.915488 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.415477259 +0000 UTC m=+210.015575689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.958304 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" podStartSLOduration=161.958288337 podStartE2EDuration="2m41.958288337s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.908513351 +0000 UTC m=+209.508611781" watchObservedRunningTime="2026-03-16 00:10:00.958288337 +0000 UTC m=+209.558386767" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.958610 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" podStartSLOduration=161.958605676 podStartE2EDuration="2m41.958605676s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.95707157 +0000 UTC m=+209.557170000" watchObservedRunningTime="2026-03-16 00:10:00.958605676 +0000 UTC m=+209.558704106" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.016100 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.016655 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.516639068 +0000 UTC m=+210.116737488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.016854 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.016941 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.017012 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.017082 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.018909 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.518891095 +0000 UTC m=+210.118989525 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.022606 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.025296 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.026253 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.083386 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"]
Mar 16 00:10:01 crc kubenswrapper[4983]: W0316 00:10:01.101074 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39b8480_5521_4ff7_b6ec_4f67009b1f5c.slice/crio-7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980 WatchSource:0}: Error finding container 7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980: Status 404 returned error can't find the container with id 7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.119333 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.119705 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.119892 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.119899 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.61987936 +0000 UTC m=+210.219977790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.124315 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.129607 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.146716 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.159488 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.221053 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.221434 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.72141853 +0000 UTC m=+210.321516950 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.322622 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.331741 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.831708482 +0000 UTC m=+210.431806912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.331878 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.332165 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.832155455 +0000 UTC m=+210.432253885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.348925 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" event={"ID":"22b9ac88-75ea-4572-bd27-f819caf4d8e2","Type":"ContainerStarted","Data":"db44784233b9ad9415444196abd7c2faf0178b2ea7e916574b985ac8f9ee8bdc"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.348965 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" event={"ID":"22b9ac88-75ea-4572-bd27-f819caf4d8e2","Type":"ContainerStarted","Data":"fc830bbf4e44f41be59d2028c77e60219664f9f57787783c802978e339a499c4"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.350207 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.359996 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" event={"ID":"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a","Type":"ContainerStarted","Data":"0afe989832c835fb78017edd78caeb4fb8bd8816c3ff94dd67dc89d5503b5838"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.377305 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" event={"ID":"b227bf69-003e-4831-8ce3-a5b1f7f85c31","Type":"ContainerStarted","Data":"d0fe8155e1db748bfd6f6a3c0969379dbdd3651f5523ef5689ebe917e0ce2b31"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.384025 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" event={"ID":"8ac1b1cc-8499-493f-a8d9-801eb433163f","Type":"ContainerStarted","Data":"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.384104 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" event={"ID":"8ac1b1cc-8499-493f-a8d9-801eb433163f","Type":"ContainerStarted","Data":"8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.385902 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-82r5r" event={"ID":"55da5246-1df8-4666-ad7c-9407719b3abb","Type":"ContainerStarted","Data":"de6be5c08ae54bf0f30e8f28268227d5ca4e161c02d894131f116068fe5fb4c7"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.400095 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" event={"ID":"2688c073-5209-4258-a681-186370d9abcc","Type":"ContainerStarted","Data":"9aa5f237308263393189ed9b77b9ee06c8bb53be68ab9fde971ad882c8563d6a"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.426733 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" podStartSLOduration=162.426707197 podStartE2EDuration="2m42.426707197s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.418843663 +0000 UTC m=+210.018942093" watchObservedRunningTime="2026-03-16 00:10:01.426707197 +0000 UTC m=+210.026805647" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.427442 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.433087 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.433443 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.933426398 +0000 UTC m=+210.533524828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.433501 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.434177 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.93417005 +0000 UTC m=+210.534268480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.472743 4983 generic.go:334] "Generic (PLEG): container finished" podID="9c737bbb-9153-4689-bbd7-1925cd53b343" containerID="62141da8c2526e97aa1822a7f3f641b9ed6f2d142101bf4f93cbc2e90f300eea" exitCode=0 Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.472811 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" event={"ID":"9c737bbb-9153-4689-bbd7-1925cd53b343","Type":"ContainerDied","Data":"62141da8c2526e97aa1822a7f3f641b9ed6f2d142101bf4f93cbc2e90f300eea"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.486122 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" event={"ID":"249f0516-0237-4ba3-92eb-a7aa3b9c62c1","Type":"ContainerStarted","Data":"568845ee7ac2c267cff9d54987aa3c87ae9f7d3363c12766495c05454443fca2"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.503639 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-82r5r" podStartSLOduration=6.503620123 podStartE2EDuration="6.503620123s" podCreationTimestamp="2026-03-16 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.450554129 +0000 UTC m=+210.050652579" watchObservedRunningTime="2026-03-16 00:10:01.503620123 +0000 UTC m=+210.103718553" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.504381 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" event={"ID":"33ff8ce9-2d36-4251-ae9d-802d9965bfde","Type":"ContainerStarted","Data":"f43fc07ca791e8d3877807f155909f93329aa455340ee15d5dec03ee7b00c13d"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.511996 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560328-sngnj" event={"ID":"9da42bf3-da76-4db7-9653-f2f08567084f","Type":"ContainerStarted","Data":"fde617a4855b193426c3b4102e81b29ab0d3e6c44d90e708f2f6bda3bb35ebf8"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.515320 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" event={"ID":"aef72d9a-3e65-495f-8e73-ee539c10a29e","Type":"ContainerStarted","Data":"32982a122c4230e09d6194d06c3188d9b95d47f1e5ee640325ee4909c9ed26c5"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.517316 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-65dr5" event={"ID":"c39b8480-5521-4ff7-b6ec-4f67009b1f5c","Type":"ContainerStarted","Data":"7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.519809 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" 
event={"ID":"0153d604-68c6-465e-9714-463f0e7e4c41","Type":"ContainerStarted","Data":"2f6ac418ab83db7361af1e5d0897d96c9e84cd20e3d27e7aa8176847f1f3a492"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.521599 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" event={"ID":"de84a408-0f98-48c6-83a5-e6976b576989","Type":"ContainerStarted","Data":"aeb14511e9d7abc3a665b28b92162b19c96ca2a289f5d44810a7c3dd5113a574"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.523774 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" event={"ID":"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d","Type":"ContainerStarted","Data":"ff995c0679f94df40aa9c87e2609ad1cfe3da3323aee4535dcb64f151df65d57"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.524412 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.531316 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" event={"ID":"31984625-3905-4d4d-9c52-e7d11c6c15d4","Type":"ContainerStarted","Data":"7d17b520540e29ad54b4b7565c0ea7d810bd02ec7c86200a0cf535f85d956f87"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.532678 4983 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-ql6v2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.532744 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" podUID="f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.535312 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.535812 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.035769003 +0000 UTC m=+210.635867453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.535999 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.536985 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" event={"ID":"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f","Type":"ContainerStarted","Data":"dbf20a3ef1ed3e47b6b6d1f462e21aac4f1f95b0cb81df81c7adac4f00b18da0"} Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.537420 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.037407462 +0000 UTC m=+210.637505942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.553279 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" podStartSLOduration=162.553257165 podStartE2EDuration="2m42.553257165s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.532148275 +0000 UTC m=+210.132246715" watchObservedRunningTime="2026-03-16 00:10:01.553257165 +0000 UTC m=+210.153355595" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.553616 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" podStartSLOduration=162.553608635 podStartE2EDuration="2m42.553608635s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.552947105 +0000 UTC m=+210.153045545" watchObservedRunningTime="2026-03-16 00:10:01.553608635 +0000 UTC m=+210.153707065" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.566326 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" 
event={"ID":"34398886-1821-47c0-bbff-951177287627","Type":"ContainerStarted","Data":"e9babf5e0b9fb5ed9b0059cdd6c59bc430c9fae68c880bbfb9f80d6e711e2a4a"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.576372 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w8qpq" event={"ID":"dfea0242-abc1-4912-a193-6c4dc75d9bb5","Type":"ContainerStarted","Data":"42e4540e74db8d1e15507582abac8245193e5683a94a2cf1102c1a10a2a3265a"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.591546 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" podStartSLOduration=162.591525557 podStartE2EDuration="2m42.591525557s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.583463706 +0000 UTC m=+210.183562136" watchObservedRunningTime="2026-03-16 00:10:01.591525557 +0000 UTC m=+210.191623997" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.608550 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-w8qpq" podStartSLOduration=162.608529464 podStartE2EDuration="2m42.608529464s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.60805554 +0000 UTC m=+210.208153990" watchObservedRunningTime="2026-03-16 00:10:01.608529464 +0000 UTC m=+210.208627894" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.609354 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5zxcb" event={"ID":"9c413c46-e4ff-43f2-b66a-8a62e1f08890","Type":"ContainerStarted","Data":"2a7d1438a2f4b768ee017c4a995fbace90b444f599c615beb69ed4a8dbbf2535"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.609398 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5zxcb" event={"ID":"9c413c46-e4ff-43f2-b66a-8a62e1f08890","Type":"ContainerStarted","Data":"19f550fedf916378df938f466a725fe092180d7e9338e70c9778f0b22704bf3a"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.620074 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" event={"ID":"ebd00ffd-95e2-47bf-a6fd-663526b2283d","Type":"ContainerStarted","Data":"720cccf4750e93f0cac636b8f690f69530885b94b4b99cfe81b8d8691bb0ac1b"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.654306 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.655804 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.155787215 +0000 UTC m=+210.755885645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.688271 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" event={"ID":"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7","Type":"ContainerStarted","Data":"cbad8360383812db201164e02dc84201e30bf6e665fac7ec7d9208decb400509"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.689018 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.716918 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" podStartSLOduration=162.716904479 podStartE2EDuration="2m42.716904479s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.715288531 +0000 UTC m=+210.315386961" watchObservedRunningTime="2026-03-16 00:10:01.716904479 +0000 UTC m=+210.317002909" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.717382 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5zxcb" podStartSLOduration=6.717377743 podStartE2EDuration="6.717377743s" podCreationTimestamp="2026-03-16 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.654054213 +0000 UTC m=+210.254152643" watchObservedRunningTime="2026-03-16 00:10:01.717377743 +0000 UTC m=+210.317476173" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.746803 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" event={"ID":"5373e962-abd6-4153-9cc9-7d17b9ae5fe5","Type":"ContainerStarted","Data":"5fcb8f3e6330c0e897c5e39d3a96b822b82c1d6ad194d1ebe0d1b9e57e4887db"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.754901 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"e0dc80f80d5d7392c25a4334242663a0ba9cb67f7c9729fd71d5c1f6339b83bc"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.755716 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.757636 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:02.257623204 +0000 UTC m=+210.857721634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.760068 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" event={"ID":"94e46a85-c462-4ef3-a944-6ed47d2b0598","Type":"ContainerStarted","Data":"f41daa53721aa931e3c04682ddc91e1dc1bad874e3d3733d9fb034b95a99f1ad"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.762938 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mjkh8" event={"ID":"211c2269-7173-4fcb-9403-be48b10ab364","Type":"ContainerStarted","Data":"210f991cfbd379418f978742f9e5ee3dc1c4f7f781e87153dab509a41a256d8c"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.776696 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" event={"ID":"61000119-35ce-40ee-a8c5-5ad9052b539d","Type":"ContainerStarted","Data":"670f5a893936b8ad96a01d0355ddc5d7d757a88f85618efe4199c69a6ead4fd8"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.782722 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" event={"ID":"74d1b439-9506-4a1a-a1a4-3f5ca7944750","Type":"ContainerStarted","Data":"be8da38062686a4afdd070377ef603c986ba63707735ff708762f4c2c2a1bd61"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.788857 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" podStartSLOduration=162.788838676 podStartE2EDuration="2m42.788838676s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.787683221 +0000 UTC m=+210.387781651" watchObservedRunningTime="2026-03-16 00:10:01.788838676 +0000 UTC m=+210.388937106" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.804170 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" event={"ID":"143ccd96-ced1-466f-8891-72abc221bbac","Type":"ContainerStarted","Data":"6aef27281bf276883a364fa083d17698ca8634526f370fd890cb269151edb872"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.807854 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" event={"ID":"6b0e4e23-a158-4597-b005-db088a652ec8","Type":"ContainerStarted","Data":"22de531062765275223bfd19cc2f919cc47730fd4df219c2ba7c9232ff4ac956"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.820135 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= 
Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.820185 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.820615 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" event={"ID":"87a722ee-1078-41fd-bd5e-96981b43652d","Type":"ContainerStarted","Data":"44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.822139 4983 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9hbqr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.822279 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.832683 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.832748 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.832865 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.832905 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.833018 4983 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-mw6rk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.833051 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" podUID="8662dd30-6a4c-4a3d-a3bb-8d24821241fa" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.834283 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" podStartSLOduration=162.834249931 
podStartE2EDuration="2m42.834249931s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.813130501 +0000 UTC m=+210.413228931" watchObservedRunningTime="2026-03-16 00:10:01.834249931 +0000 UTC m=+210.434348361" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.836105 4983 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tj49l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.836143 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.845050 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" podStartSLOduration=162.845033853 podStartE2EDuration="2m42.845033853s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.839652943 +0000 UTC m=+210.439751373" watchObservedRunningTime="2026-03-16 00:10:01.845033853 +0000 UTC m=+210.445132283" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.856853 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.856969 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.356948599 +0000 UTC m=+210.957047029 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.857169 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.859553 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.359537486 +0000 UTC m=+210.959635916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.916980 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" podStartSLOduration=162.916963259 podStartE2EDuration="2m42.916963259s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.87944251 +0000 UTC m=+210.479540940" watchObservedRunningTime="2026-03-16 00:10:01.916963259 +0000 UTC m=+210.517061679" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.931259 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" podStartSLOduration=162.931243535 podStartE2EDuration="2m42.931243535s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.914837246 +0000 UTC m=+210.514935676" watchObservedRunningTime="2026-03-16 00:10:01.931243535 +0000 UTC m=+210.531341965" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.933543 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" podStartSLOduration=162.933537154 podStartE2EDuration="2m42.933537154s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.932270386 +0000 UTC m=+210.532368816" watchObservedRunningTime="2026-03-16 00:10:01.933537154 +0000 UTC m=+210.533635574" Mar 16 
00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.956217 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" podStartSLOduration=162.95620014 podStartE2EDuration="2m42.95620014s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.954970774 +0000 UTC m=+210.555069204" watchObservedRunningTime="2026-03-16 00:10:01.95620014 +0000 UTC m=+210.556298570" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.958265 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.959625 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.459611962 +0000 UTC m=+211.059710392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.058346 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qvtjp"] Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.060035 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.060357 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.560345829 +0000 UTC m=+211.160444259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.161635 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.161991 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.661949721 +0000 UTC m=+211.262048151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.162119 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.162612 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.662599081 +0000 UTC m=+211.262697511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.262669 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.262992 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.762972886 +0000 UTC m=+211.363071316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.263210 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.263710 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.763701818 +0000 UTC m=+211.363800248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.364770 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.364936 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.864915759 +0000 UTC m=+211.465014189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.365128 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.365450 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.865439355 +0000 UTC m=+211.465537785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.466998 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.467154 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.96712796 +0000 UTC m=+211.567226400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.467220 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.467571 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.967561973 +0000 UTC m=+211.567660403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.568701 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.568902 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.068876517 +0000 UTC m=+211.668974947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.569243 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.569526 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.069513276 +0000 UTC m=+211.669611706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.670002 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.670357 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.170342425 +0000 UTC m=+211.770440855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.770963 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.771307 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.271289468 +0000 UTC m=+211.871387918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.820935 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.820982 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.868976 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" event={"ID":"0153d604-68c6-465e-9714-463f0e7e4c41","Type":"ContainerStarted","Data":"81c835875b0da5ad00c0eef0ef68928bd1f88ad221a8ad83b11565521d53a877"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.872415 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.872869 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.3728494 +0000 UTC m=+211.972947830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.878885 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mjkh8" event={"ID":"211c2269-7173-4fcb-9403-be48b10ab364","Type":"ContainerStarted","Data":"cb4b2aacb2be069cb4c1685aaa72201857f54a52832e05eebd106f9c143a5354"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.879148 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mjkh8" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.896891 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" event={"ID":"9c737bbb-9153-4689-bbd7-1925cd53b343","Type":"ContainerStarted","Data":"903a98ca3fc8090a6bba6e9a74bfd394cc0e3a734ad7457095bd40fd0923d09a"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.908022 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dcbe45123839f013b4b0ceffab4b0cee00aa8f3ec5219a7fb1dfc90d1e5eff99"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.908076 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"06ddd8beef35d8422a3e7f922e22433833d8360b1c8b3f1f7472018fccc21447"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.908677 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.915071 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" podStartSLOduration=164.915038779 podStartE2EDuration="2m44.915038779s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:02.892608779 +0000 UTC m=+211.492707209" watchObservedRunningTime="2026-03-16 00:10:02.915038779 +0000 UTC m=+211.515137209" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.943042 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mjkh8" podStartSLOduration=7.943019124 podStartE2EDuration="7.943019124s" podCreationTimestamp="2026-03-16 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:02.911893805 +0000 UTC m=+211.511992255" watchObservedRunningTime="2026-03-16 00:10:02.943019124 +0000 UTC m=+211.543117554" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.945748 4983 generic.go:334] "Generic (PLEG): container finished" podID="249f0516-0237-4ba3-92eb-a7aa3b9c62c1" containerID="6a5c37f8634f9377e518dee2138e5fcfd25e9fa89f065c718af9889083a0f35a" exitCode=0 Mar 16 00:10:02 
crc kubenswrapper[4983]: I0316 00:10:02.945859 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" event={"ID":"249f0516-0237-4ba3-92eb-a7aa3b9c62c1","Type":"ContainerDied","Data":"6a5c37f8634f9377e518dee2138e5fcfd25e9fa89f065c718af9889083a0f35a"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.957334 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" podStartSLOduration=163.957312451 podStartE2EDuration="2m43.957312451s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:02.956454175 +0000 UTC m=+211.556552605" watchObservedRunningTime="2026-03-16 00:10:02.957312451 +0000 UTC m=+211.557410881" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.966778 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" event={"ID":"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a","Type":"ContainerStarted","Data":"acdc4e189ccf4e761d6043513cbad293003048c67a648d207e8f75197775a5cf"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.985427 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.986555 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.486537803 +0000 UTC m=+212.086636303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.990192 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d3912b7b9d1bd2499676a4f638d603dc9dbee76b6a15b4e26df0827023be11e9"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.990234 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5b0530f23d114cca92c1d71521e7fcfd5ec54179eba9c0dc7e38700236d16627"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.004703 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" event={"ID":"de84a408-0f98-48c6-83a5-e6976b576989","Type":"ContainerStarted","Data":"4337cad217a97d1f2298838954e7ca6bd588c7455c966b033c070741a57ef710"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.004741 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" event={"ID":"de84a408-0f98-48c6-83a5-e6976b576989","Type":"ContainerStarted","Data":"447747f164006bcfcf2a9ddf6febc450a45b097b5067b51e460ed9a461ff8370"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.004973 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.012094 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" event={"ID":"6b0e4e23-a158-4597-b005-db088a652ec8","Type":"ContainerStarted","Data":"a83be389e601cd0deeac5a22896d87007bfbd4fcb5c5277d680cd9b94bcebd7e"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.017143 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9ad31635a8bdd45a91fd6f1e45def5a319d6c3c06026587eb7989f6f76d2be8a"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.017190 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bd6198faed51abc42a3fd4706252c5b14a466a8b93f87fb8628ca9bd7e3c74a1"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.019841 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" podStartSLOduration=164.019829506 podStartE2EDuration="2m44.019829506s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-16 00:10:03.014516668 +0000 UTC m=+211.614615098" watchObservedRunningTime="2026-03-16 00:10:03.019829506 +0000 UTC m=+211.619927936" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.035824 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" event={"ID":"6993dda4-ac10-47af-b406-d49d7781fbe5","Type":"ContainerStarted","Data":"09c2019faf6c1afbd5f21bf9ee34e82dda5d63d96d1b396605aef92fe6ba01c0"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.035883 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" event={"ID":"6993dda4-ac10-47af-b406-d49d7781fbe5","Type":"ContainerStarted","Data":"ad1bdf4b5b5bf503936d824be0bbab593dff431a6c2e73a488599b99cc8935f7"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.041913 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" event={"ID":"2688c073-5209-4258-a681-186370d9abcc","Type":"ContainerStarted","Data":"33bfa229c8d0a420615966bd13f8a47649b744e9e51c7ff16e5d3678e3862f6a"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.071771 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" event={"ID":"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f","Type":"ContainerStarted","Data":"601daed22a504039799d8c8241a370755a02a384e33777087b32aa2a5d19b047"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.082713 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.083081 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.090998 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.091095 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.591076433 +0000 UTC m=+212.191174863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.091242 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.093802 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.593791384 +0000 UTC m=+212.193889814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.095420 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" event={"ID":"61000119-35ce-40ee-a8c5-5ad9052b539d","Type":"ContainerStarted","Data":"17ad26d1a217c35c863e0de5f9a8aaeaa7505645e51dd84b4453328d77141812"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.095937 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.096053 4983 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-56ljn container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.096083 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" podUID="9c737bbb-9153-4689-bbd7-1925cd53b343" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.098835 4983 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mhd52 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.098866 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" 
podUID="61000119-35ce-40ee-a8c5-5ad9052b539d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.110449 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" podStartSLOduration=164.110429641 podStartE2EDuration="2m44.110429641s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.101953588 +0000 UTC m=+211.702052018" watchObservedRunningTime="2026-03-16 00:10:03.110429641 +0000 UTC m=+211.710528071" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.140075 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" event={"ID":"22b9ac88-75ea-4572-bd27-f819caf4d8e2","Type":"ContainerStarted","Data":"8ce704030e1988d292db3e568f8d509b352df1c6490541930e4e6237b579a9fb"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.153113 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" event={"ID":"33ff8ce9-2d36-4251-ae9d-802d9965bfde","Type":"ContainerStarted","Data":"c97d7de6e52473a5f89c81863fac5338131db5dae8531818aeae4c774d1af7e8"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.153154 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" event={"ID":"33ff8ce9-2d36-4251-ae9d-802d9965bfde","Type":"ContainerStarted","Data":"f8a460c7ca84aeeb996af8f495c1120ff629101e9cc8957095906323e0a19a51"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.172553 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" podStartSLOduration=165.172538464 podStartE2EDuration="2m45.172538464s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.170298918 +0000 UTC m=+211.770397358" watchObservedRunningTime="2026-03-16 00:10:03.172538464 +0000 UTC m=+211.772636894" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.173219 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" podStartSLOduration=164.173212695 podStartE2EDuration="2m44.173212695s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.13787538 +0000 UTC m=+211.737973810" watchObservedRunningTime="2026-03-16 00:10:03.173212695 +0000 UTC m=+211.773311125" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.176280 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" event={"ID":"31984625-3905-4d4d-9c52-e7d11c6c15d4","Type":"ContainerStarted","Data":"408ad5fdd3cf2f8905db91d37047012b541907ae6f543d36c294aebc0c9a0470"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.182716 4983 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-ql6v2 
container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.182779 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" podUID="f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.183013 4983 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rvjb2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.187588 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.183262 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.183142 4983 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tj49l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.187720 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.195273 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.196411 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.696389746 +0000 UTC m=+212.296488176 (durationBeforeRetry 500ms). 
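The prober lines above all follow one pattern: an HTTP GET against the container's probe endpoint with a short timeout, where both a refused connection (the process has not bound its port yet) and a non-2xx/3xx status count as failures, recorded together with the start of the response body. A self-contained sketch of that style of check, using the marketplace-operator readiness URL from this log as the example target (the one-second timeout is an assumption):

    // probe.go: a minimal HTTP probe in the spirit of the kubelet prober
    // output above. The URL is taken from the log; the timeout is assumed.
    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    func probe(url string) (string, error) {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            // e.g. "dial tcp 10.217.0.35:8080: connect: connection refused"
            return "failure", err
        }
        defer resp.Body.Close()
        // Keep only the start of the body, as the "start-of-body" field does.
        body, _ := io.ReadAll(io.LimitReader(resp.Body, 256))
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return "success", nil
        }
        return "failure", fmt.Errorf("HTTP probe failed with statuscode: %d, start-of-body: %s",
            resp.StatusCode, body)
    }

    func main() {
        result, err := probe("http://10.217.0.35:8080/healthz")
        fmt.Println(result, err)
    }
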
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.202924 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qvtjp" podStartSLOduration=164.202908241 podStartE2EDuration="2m44.202908241s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.201398196 +0000 UTC m=+211.801496626" watchObservedRunningTime="2026-03-16 00:10:03.202908241 +0000 UTC m=+211.803006681" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.297399 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.303449 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.803432681 +0000 UTC m=+212.403531131 (durationBeforeRetry 500ms). 
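The pod_startup_latency_tracker numbers are plain subtraction. In the network-metrics-daemon-qvtjp record just above, podStartSLOduration=164.202908241 is exactly watchObservedRunningTime (00:10:03.202908241) minus podCreationTimestamp (00:07:19), and podStartE2EDuration is the same figure rendered as a duration string. A few lines of Go reproduce the arithmetic:

    // sloduration.go: reproduce the podStartSLOduration arithmetic from the
    // startup-latency record above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        created, _ := time.Parse(time.RFC3339, "2026-03-16T00:07:19Z")
        observed, _ := time.Parse(time.RFC3339Nano, "2026-03-16T00:10:03.202908241Z")
        d := observed.Sub(created)
        fmt.Printf("%.9f\n", d.Seconds()) // 164.202908241
        fmt.Println(d)                    // 2m44.202908241s
    }

The firstStartedPulling and lastFinishedPulling fields are the zero time (0001-01-01) in these records, meaning no image pull was recorded for these pods.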
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.321157 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" podStartSLOduration=164.32113926 podStartE2EDuration="2m44.32113926s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.303846054 +0000 UTC m=+211.903944484" watchObservedRunningTime="2026-03-16 00:10:03.32113926 +0000 UTC m=+211.921237680" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.381032 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" podStartSLOduration=164.381014257 podStartE2EDuration="2m44.381014257s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.369291817 +0000 UTC m=+211.969390247" watchObservedRunningTime="2026-03-16 00:10:03.381014257 +0000 UTC m=+211.981112687" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.399556 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.400109 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.900094116 +0000 UTC m=+212.500192546 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.487418 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" podStartSLOduration=164.487402792 podStartE2EDuration="2m44.487402792s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.487079533 +0000 UTC m=+212.087177963" watchObservedRunningTime="2026-03-16 00:10:03.487402792 +0000 UTC m=+212.087501212" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.501663 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.502016 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.002003828 +0000 UTC m=+212.602102258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.583559 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" podStartSLOduration=164.583542622 podStartE2EDuration="2m44.583542622s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.578377328 +0000 UTC m=+212.178475758" watchObservedRunningTime="2026-03-16 00:10:03.583542622 +0000 UTC m=+212.183641042" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.605859 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.606039 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.106014852 +0000 UTC m=+212.706113282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.606106 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.607295 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.10728274 +0000 UTC m=+212.707381170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.651916 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" podStartSLOduration=164.651897152 podStartE2EDuration="2m44.651897152s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.64681541 +0000 UTC m=+212.246913860" watchObservedRunningTime="2026-03-16 00:10:03.651897152 +0000 UTC m=+212.251995582" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.708840 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.709242 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.209225313 +0000 UTC m=+212.809323753 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.813610 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.814002 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.31398607 +0000 UTC m=+212.914084500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.827986 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:03 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:03 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:03 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.828043 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.923807 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.924152 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.424137557 +0000 UTC m=+213.024235987 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.025566 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.025947 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.525937056 +0000 UTC m=+213.126035486 (durationBeforeRetry 500ms). 
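The router probe body quoted above shows the conventional aggregated-healthz format: one [+] or [-] line per named check, reasons withheld on the unauthenticated endpoint, and a closing "healthz check failed" with HTTP 500 when any check fails, which the prober then reports as "HTTP probe failed with statuscode: 500". A minimal handler that produces output of the same shape; this is an illustration, not the OpenShift router's implementation, and the check logic is invented:

    // healthz.go: an aggregated health handler whose output matches the
    // shape of the router probe body above. Check names come from the log.
    package main

    import (
        "fmt"
        "net/http"
    )

    type check struct {
        name string
        fn   func() error
    }

    func healthz(checks []check) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            body := ""
            failed := false
            for _, c := range checks {
                if err := c.fn(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", c.name)
                }
            }
            if failed {
                w.WriteHeader(http.StatusInternalServerError) // statuscode: 500
                body += "healthz check failed\n"
            }
            fmt.Fprint(w, body)
        }
    }

    func main() {
        checks := []check{
            {"backend-http", func() error { return fmt.Errorf("no backends yet") }},
            {"has-synced", func() error { return fmt.Errorf("initial sync pending") }},
            {"process-running", func() error { return nil }},
        }
        http.HandleFunc("/healthz/ready", healthz(checks))
        http.ListenAndServe("localhost:1936", nil) // port taken from the router probe
    }
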
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.127078 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.127349 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.627335212 +0000 UTC m=+213.227433642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.192669 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" event={"ID":"6993dda4-ac10-47af-b406-d49d7781fbe5","Type":"ContainerStarted","Data":"2831085478f698d01baf77ca8701f5dd942580b8bae8558e6ac9202616b0ff9d"} Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.206666 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" event={"ID":"249f0516-0237-4ba3-92eb-a7aa3b9c62c1","Type":"ContainerStarted","Data":"31366c6413075ef6992692d577cecf196661d413fe9923130b380f38340fd868"} Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.214981 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"0c2c1a0046a1102ba0d7e2abbd5c28bec7b5e06f88bf9eaf2b7e311fd297f03a"} Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.218569 4983 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tj49l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.218633 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.218689 4983 patch_prober.go:28] interesting 
pod/openshift-config-operator-7777fb866f-np9wn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.218730 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" podUID="d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.228966 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.229510 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.729465331 +0000 UTC m=+213.329563761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.256117 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.296353 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.308864 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" podStartSLOduration=165.3088476 podStartE2EDuration="2m45.3088476s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.684086483 +0000 UTC m=+212.284184913" watchObservedRunningTime="2026-03-16 00:10:04.3088476 +0000 UTC m=+212.908946020" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.330264 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.330410 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.830386773 +0000 UTC m=+213.430485203 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.331013 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.334550 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.834532507 +0000 UTC m=+213.434631017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.432276 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.432435 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.932409938 +0000 UTC m=+213.532508368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.432648 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.432985 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.932971335 +0000 UTC m=+213.533069765 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.534291 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.534410 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.034388412 +0000 UTC m=+213.634486842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.534513 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.534867 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.034855846 +0000 UTC m=+213.634954266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.576351 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.640336 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.640367 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.140348434 +0000 UTC m=+213.740446864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.640782 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.641092 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.141082566 +0000 UTC m=+213.741180996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.741724 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.741906 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.241882975 +0000 UTC m=+213.841981405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.741934 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.742291 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.242283397 +0000 UTC m=+213.842381827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.826250 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:04 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:04 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:04 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.826315 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.842930 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.843248 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.343160058 +0000 UTC m=+213.943258488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.843319 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.843699 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.343683503 +0000 UTC m=+213.943781933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.904440 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44210: no serving certificate available for the kubelet" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.945144 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.945413 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.445397609 +0000 UTC m=+214.045496039 (durationBeforeRetry 500ms). 
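The "???:1] http: TLS handshake error ... no serving certificate available for the kubelet" records mean the kubelet is being scraped before its serving certificate exists; on OpenShift that certificate arrives through an approved CertificateSigningRequest, and until then every handshake from 192.168.126.11 is rejected. One way to watch for the certificate to appear is to poll the TLS port and dump whatever certificate is presented. A diagnostic sketch, assuming the standard kubelet port 10250 and network reach to the node:

    // servingcert.go: poll the kubelet's TLS port and report the serving
    // certificate it presents, once one exists.
    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        for {
            conn, err := tls.Dial("tcp", "192.168.126.11:10250",
                &tls.Config{InsecureSkipVerify: true}) // inspection only, no verification
            if err != nil {
                fmt.Println("handshake failed:", err) // the client-side view of the errors above
                time.Sleep(2 * time.Second)
                continue
            }
            cert := conn.ConnectionState().PeerCertificates[0]
            fmt.Println("serving cert:", cert.Subject, "notAfter:", cert.NotAfter)
            conn.Close()
            return
        }
    }
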
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.006354 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44216: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.046282 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.046680 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.546663672 +0000 UTC m=+214.146762092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.112647 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44224: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.147916 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.148232 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.648217373 +0000 UTC m=+214.248315793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.227919 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44226: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.229993 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" event={"ID":"249f0516-0237-4ba3-92eb-a7aa3b9c62c1","Type":"ContainerStarted","Data":"7b6580d427d37b425db9500c884957b95c83c13d7f5ab0a1b2e5388690548529"} Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.249732 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.250133 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.750121474 +0000 UTC m=+214.350219904 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.325615 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44228: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.351475 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.352499 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.85248356 +0000 UTC m=+214.452581990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.457542 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.457901 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.957889116 +0000 UTC m=+214.557987536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.558839 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.559218 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.059202618 +0000 UTC m=+214.659301048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.577165 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44234: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.599120 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" podStartSLOduration=167.599102469 podStartE2EDuration="2m47.599102469s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:05.267226135 +0000 UTC m=+213.867324565" watchObservedRunningTime="2026-03-16 00:10:05.599102469 +0000 UTC m=+214.199200899" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.602093 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.602995 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.612029 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.629085 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.643081 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44248: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.672665 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.678788 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680261 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680309 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680337 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680355 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680395 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680411 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680445 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.680687 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.180676204 +0000 UTC m=+214.780774634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.683242 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.783259 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.783780 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.28373857 +0000 UTC m=+214.883837000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.783866 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.783917 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.783992 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784036 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc 
kubenswrapper[4983]: I0316 00:10:05.784081 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784120 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784151 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.784499 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.284488833 +0000 UTC m=+214.884587263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784616 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784964 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.785026 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.785150 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc 
kubenswrapper[4983]: I0316 00:10:05.789427 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.806957 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.824680 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.827387 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:05 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:05 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:05 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.827438 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.862396 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.863004 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.872168 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.872366 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.887393 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44262: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.887453 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-txzqn"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.888461 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.888843 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.889018 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.388998622 +0000 UTC m=+214.989097052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889608 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889654 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889692 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889779 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889822 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889858 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.890382 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.390361153 +0000 UTC m=+214.990459583 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.899431 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.919917 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txzqn"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.931186 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.002594 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.002924 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.002962 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.002997 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.003023 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.003054 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.003562 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.003619 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.005224 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.50520209 +0000 UTC m=+215.105300520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.005460 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.015463 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.069742 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.086500 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.106787 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.107204 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.607187594 +0000 UTC m=+215.207286024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.128302 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.129195 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.129274 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.211316 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.211668 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.711652872 +0000 UTC m=+215.311751302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.211747 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.231069 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.324281 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.324644 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.324804 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.325167 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.340174 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.840146257 +0000 UTC m=+215.440244687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.427964 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.428215 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.428289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.428332 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.429136 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.429212 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.929197045 +0000 UTC m=+215.529295475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.429400 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.480783 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.518293 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.518486 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" containerID="cri-o://fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756" gracePeriod=30 Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.529169 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.529504 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.029491459 +0000 UTC m=+215.629589879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.548142 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.550729 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.552057 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager" containerID="cri-o://8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013" gracePeriod=30 Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.611744 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44276: no serving certificate available for the kubelet" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.619044 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.630202 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.630719 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.130524794 +0000 UTC m=+215.730623224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: W0316 00:10:06.730274 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf617dbbc_f757_49b9_b8c6_7d0c07cb197e.slice/crio-560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b WatchSource:0}: Error finding container 560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b: Status 404 returned error can't find the container with id 560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.732043 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.732326 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.232314382 +0000 UTC m=+215.832412802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.757843 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.769102 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.829488 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:06 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:06 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:06 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.829848 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.837462 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.837818 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.337799161 +0000 UTC m=+215.937897591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.842833 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txzqn"]
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.886164 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.888033 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"]
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.942106 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.942610 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.442594089 +0000 UTC m=+216.042692519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.030877 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.042886 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.043551 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.543533211 +0000 UTC m=+216.143631651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145449 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145634 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145663 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145702 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145732 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.146586 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config" (OuterVolumeSpecName: "config") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.146788 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca" (OuterVolumeSpecName: "client-ca") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.147054 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.148470 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.148825 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.148838 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.148847 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.149079 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.649066621 +0000 UTC m=+216.249165051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.150257 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.158991 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9" (OuterVolumeSpecName: "kube-api-access-9zjz9") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "kube-api-access-9zjz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.179214 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229161 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"]
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.229352 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229363 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.229372 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229378 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229468 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229479 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229803 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.248835 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249596 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249628 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") pod \"8ac1b1cc-8499-493f-a8d9-801eb433163f\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249657 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") pod \"8ac1b1cc-8499-493f-a8d9-801eb433163f\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249688 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249707 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"8ac1b1cc-8499-493f-a8d9-801eb433163f\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249726 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") pod \"8ac1b1cc-8499-493f-a8d9-801eb433163f\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.250143 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.250237 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.251580 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca" (OuterVolumeSpecName: "client-ca") pod "8ac1b1cc-8499-493f-a8d9-801eb433163f" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.252043 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config" (OuterVolumeSpecName: "config") pod "8ac1b1cc-8499-493f-a8d9-801eb433163f" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.252144 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.752124367 +0000 UTC m=+216.352222797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253366 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253483 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253549 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253597 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253684 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253699 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253712 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253725 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253852 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"]
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.255958 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.755943631 +0000 UTC m=+216.356042061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.268283 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8ac1b1cc-8499-493f-a8d9-801eb433163f" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.268431 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h" (OuterVolumeSpecName: "kube-api-access-vg86h") pod "8ac1b1cc-8499-493f-a8d9-801eb433163f" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f"). InnerVolumeSpecName "kube-api-access-vg86h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.274928 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.279206 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b","Type":"ContainerStarted","Data":"5b6b697b43a3ca9aed435659be5a4adfa260345d670f3e9fc7b2402ed1c8de07"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.283643 4983 generic.go:334] "Generic (PLEG): container finished" podID="54a768f3-aa53-481d-b179-5c8807f69e89" containerID="fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756" exitCode=0
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.283742 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.284235 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" event={"ID":"54a768f3-aa53-481d-b179-5c8807f69e89","Type":"ContainerDied","Data":"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.284328 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" event={"ID":"54a768f3-aa53-481d-b179-5c8807f69e89","Type":"ContainerDied","Data":"b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.284389 4983 scope.go:117] "RemoveContainer" containerID="fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.293524 4983 generic.go:334] "Generic (PLEG): container finished" podID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerID="1fc80a9e4fb01c05cb775f45190ece9037ca337a03452dd8abf5a08dd242d1da" exitCode=0
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.293600 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerDied","Data":"1fc80a9e4fb01c05cb775f45190ece9037ca337a03452dd8abf5a08dd242d1da"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.293624 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerStarted","Data":"560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.313853 4983 generic.go:334] "Generic (PLEG): container finished" podID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerID="8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013" exitCode=0
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.313924 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" event={"ID":"8ac1b1cc-8499-493f-a8d9-801eb433163f","Type":"ContainerDied","Data":"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.313950 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" event={"ID":"8ac1b1cc-8499-493f-a8d9-801eb433163f","Type":"ContainerDied","Data":"8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.314005 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.322055 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerStarted","Data":"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.322092 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerStarted","Data":"6ff5c36eac345013e6cc957efaa73b943a59621eebe35f0b43b2431024b1cecb"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.333013 4983 scope.go:117] "RemoveContainer" containerID="fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.333722 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756\": container with ID starting with fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756 not found: ID does not exist" containerID="fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.333772 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"} err="failed to get container status \"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756\": rpc error: code = NotFound desc = could not find container \"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756\": container with ID starting with fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756 not found: ID does not exist"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.333799 4983 scope.go:117] "RemoveContainer" containerID="8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.347313 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.347843 4983 generic.go:334] "Generic (PLEG): container finished" podID="0153d604-68c6-465e-9714-463f0e7e4c41" containerID="81c835875b0da5ad00c0eef0ef68928bd1f88ad221a8ad83b11565521d53a877" exitCode=0
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.347913 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" event={"ID":"0153d604-68c6-465e-9714-463f0e7e4c41","Type":"ContainerDied","Data":"81c835875b0da5ad00c0eef0ef68928bd1f88ad221a8ad83b11565521d53a877"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360191 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360641 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360867 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360904 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360940 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360963 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360983 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361001 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361022 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361044 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361072 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361117 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361127 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.361821 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.861806741 +0000 UTC m=+216.461905171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.363110 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.364109 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.367413 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.367572 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerStarted","Data":"2c8a01779fdf7320586832f975808a3323314fc1dee647ee11f25e6ca498d9a4"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.368706 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerStarted","Data":"de21ac29d1b3f85746eecc6275790d886e43e62e160f35ab6e888afb27d08a5c"}
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.368973 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.369531 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.393564 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.411528 4983 scope.go:117] "RemoveContainer" containerID="8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.412096 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013\": container with ID starting with 8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013 not found: ID does not exist" containerID="8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.412136 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"} err="failed to get container status \"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013\": rpc error: code = NotFound desc = could not find container \"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013\": container with ID starting with 8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013 not found: ID does not exist"
Mar 16 00:10:07 crc kubenswrapper[4983]: W0316 00:10:07.414845 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6bd9bf5_fa59_4fef_9589_7b5865098bd2.slice/crio-aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e WatchSource:0}: Error finding container aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e: Status 404 returned error can't find the container with id aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.430502 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.432812 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.461703 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.461984 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.462021 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.462045 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.462074 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.462105 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.962091424 +0000 UTC m=+216.562189844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.462872 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.463178 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.476321 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.479796 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.555999 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.563131 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.564269 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.064238843 +0000 UTC m=+216.664337283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.580572 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.636043 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.637639 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.642048 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.665531 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.665582 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.665668 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmd9\" (UniqueName: \"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.665705 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.666089 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.166072492 +0000 UTC m=+216.766170922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.678779 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.700279 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.704173 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.707514 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.708368 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.733738 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.760258 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767404 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767612 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.767715 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.267687965 +0000 UTC m=+216.867786395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767801 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767892 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767923 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.768080 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmd9\" (UniqueName: \"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.768113 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.768582 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.768831 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.268824619 +0000 UTC m=+216.868923049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.769311 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.799569 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmd9\" (UniqueName: \"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.827512 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-lx4mf"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.827604 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"]
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.834578 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:07 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:07 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:07 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.834628 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.870199 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.870463 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.870504 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.871435 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.871874 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.371840484 +0000 UTC m=+216.971938914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.881242 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.881307 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.881621 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.881635 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.892528 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.936625 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44280: no serving certificate available for the kubelet"
Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.971851 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.972275 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.472262511 +0000 UTC m=+217.072360941 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.002342 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"]
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.017445 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.025143 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.051373 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"]
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.057958 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"]
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.058086 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.073585 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.074039 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.574020818 +0000 UTC m=+217.174119248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.106053 4983 patch_prober.go:28] interesting pod/console-f9d7485db-fp4l5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.106105 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fp4l5" podUID="d76474c2-7d5c-45a0-8869-d829b0c594d6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.115688 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" path="/var/lib/kubelet/pods/54a768f3-aa53-481d-b179-5c8807f69e89/volumes"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.116665 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" path="/var/lib/kubelet/pods/8ac1b1cc-8499-493f-a8d9-801eb433163f/volumes"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.117811 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-fp4l5"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.117839 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-fp4l5"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.117937 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.132897 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.176652 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.176720 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.176770 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.176915 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.180404 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.680388693 +0000 UTC m=+217.280487193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.279339 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.279655 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.279692 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.279858 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.280399 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.280519 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.281256 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.304622 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.380969 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.381464 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.881451834 +0000 UTC m=+217.481550264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.387987 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.407027 4983 generic.go:334] "Generic (PLEG): container finished" podID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerID="7455d52b296ac2dc05d5dba007a96face87721af18e58d348eedd55fbc4a2082" exitCode=0
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.407697 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerDied","Data":"7455d52b296ac2dc05d5dba007a96face87721af18e58d348eedd55fbc4a2082"}
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.407750 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerStarted","Data":"aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e"}
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.419102 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" event={"ID":"5a30db24-326a-4f24-8ea0-e3d1367a2b76","Type":"ContainerStarted","Data":"750a85be2f0f629cc184ac0a4c018b832bba1ef5898acd3b3254238edafdcee9"}
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.430494 4983 generic.go:334] "Generic (PLEG): container finished" podID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerID="2c8a01779fdf7320586832f975808a3323314fc1dee647ee11f25e6ca498d9a4" exitCode=0
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.430829 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerDied","Data":"2c8a01779fdf7320586832f975808a3323314fc1dee647ee11f25e6ca498d9a4"}
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.434524 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.437661 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.449279 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b","Type":"ContainerStarted","Data":"f9b599001c13f639d451ceabf88f4a53c98624ba99597dffc75a2261d1939597"}
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.451589 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" event={"ID":"99a90707-df7a-4c5f-9502-47f5eaafa320","Type":"ContainerStarted","Data":"75bf14131d5b8d3db0d67d7f812d7d6f097de077cb0e31a121d0d18e80488d4e"}
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.453651 4983 generic.go:334] "Generic (PLEG): container finished" podID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerID="10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c" exitCode=0
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.455131 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerDied","Data":"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"}
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.483358 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.484495 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.984473709 +0000 UTC m=+217.584572139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.496024 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"]
Mar 16 00:10:08 crc kubenswrapper[4983]: W0316 00:10:08.540921 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbebf69d_773f_4829_a4ec_e443d52ef275.slice/crio-5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45 WatchSource:0}: Error finding container 5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45: Status 404 returned error can't find the container with id 5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.558072 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.580429 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.580403553 podStartE2EDuration="3.580403553s" podCreationTimestamp="2026-03-16 00:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:08.579181966 +0000 UTC m=+217.179280396" watchObservedRunningTime="2026-03-16 00:10:08.580403553 +0000 UTC m=+217.180501973"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.585120 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.585508 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.085496225 +0000 UTC m=+217.685594655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.640028 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"]
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.641356 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.646036 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.647718 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"]
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.686989 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.687420 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.687482 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.687567 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.687878 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.18786254 +0000 UTC m=+217.787960970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.788632 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.788711 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.788772 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.788820 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.789537 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.789818 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.790032 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.290021379 +0000 UTC m=+217.890119809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.811180 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.819937 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-w8qpq"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.822462 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:08 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:08 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:08 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.822503 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.839939 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.840007 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.853916 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.889776 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") pod \"0153d604-68c6-465e-9714-463f0e7e4c41\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") "
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.889846 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") pod \"0153d604-68c6-465e-9714-463f0e7e4c41\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") "
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.890178 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.890255 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") pod \"0153d604-68c6-465e-9714-463f0e7e4c41\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") "
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.891038 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.391009933 +0000 UTC m=+217.991108353 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.893078 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume" (OuterVolumeSpecName: "config-volume") pod "0153d604-68c6-465e-9714-463f0e7e4c41" (UID: "0153d604-68c6-465e-9714-463f0e7e4c41"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.897804 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q" (OuterVolumeSpecName: "kube-api-access-cx99q") pod "0153d604-68c6-465e-9714-463f0e7e4c41" (UID: "0153d604-68c6-465e-9714-463f0e7e4c41"). InnerVolumeSpecName "kube-api-access-cx99q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.897820 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0153d604-68c6-465e-9714-463f0e7e4c41" (UID: "0153d604-68c6-465e-9714-463f0e7e4c41"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.986833 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.997236 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.997362 4983 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.997376 4983 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.997386 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.997687 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.497671567 +0000 UTC m=+218.097769997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.014649 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"]
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.060184 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.060390 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0153d604-68c6-465e-9714-463f0e7e4c41" containerName="collect-profiles"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.060404 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="0153d604-68c6-465e-9714-463f0e7e4c41" containerName="collect-profiles"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.060488 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="0153d604-68c6-465e-9714-463f0e7e4c41" containerName="collect-profiles"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.061329 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.072974 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.098897 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.101386 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.60135441 +0000 UTC m=+218.201452840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.101525 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.101662 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.101734 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.203323 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.203842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.203888 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.203927 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.204830 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.704810148 +0000 UTC m=+218.304908578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.205353 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.206553 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.229180 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.304945 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.305136 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.805103422 +0000 UTC m=+218.405201852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.305322 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.305704 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.805688849 +0000 UTC m=+218.405787279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.321568 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"]
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.399374 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.406603 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.406730 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.906704644 +0000 UTC m=+218.506803064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.407033 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.407355 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.907347764 +0000 UTC m=+218.507446194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.464248 4983 generic.go:334] "Generic (PLEG): container finished" podID="1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" containerID="f9b599001c13f639d451ceabf88f4a53c98624ba99597dffc75a2261d1939597" exitCode=0
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.464323 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b","Type":"ContainerDied","Data":"f9b599001c13f639d451ceabf88f4a53c98624ba99597dffc75a2261d1939597"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.471844 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" event={"ID":"99a90707-df7a-4c5f-9502-47f5eaafa320","Type":"ContainerStarted","Data":"3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.472912 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.475177 4983 generic.go:334] "Generic (PLEG): container finished" podID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerID="b832baa9ad863d92bef0f4bd68918c75a656cd7a0c7e14efd5e15110ac3d6de8" exitCode=0
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.475229 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerDied","Data":"b832baa9ad863d92bef0f4bd68918c75a656cd7a0c7e14efd5e15110ac3d6de8"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.475248 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerStarted","Data":"5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.477277 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" event={"ID":"0153d604-68c6-465e-9714-463f0e7e4c41","Type":"ContainerDied","Data":"2f6ac418ab83db7361af1e5d0897d96c9e84cd20e3d27e7aa8176847f1f3a492"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.477307 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f6ac418ab83db7361af1e5d0897d96c9e84cd20e3d27e7aa8176847f1f3a492"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.477381 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.496845 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerStarted","Data":"edfb4c106db9ff156e89258c7be736e143b651348ae2eece9c28a73c16f1a791"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.499075 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.506818 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerStarted","Data":"839c30c9cbe107a7c9f0dd7cc6175826e37c3a950a4d5a9be034e934974f0bc3"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.508838 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.509012 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.008980917 +0000 UTC m=+218.609079357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.509533 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.510693 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.010678608 +0000 UTC m=+218.610777118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.516737 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" event={"ID":"5a30db24-326a-4f24-8ea0-e3d1367a2b76","Type":"ContainerStarted","Data":"6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.517573 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.521664 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"385ebee4-3c06-4dcb-89d4-999ba793a9ba","Type":"ContainerStarted","Data":"4e388f80539aba9aecacdcf41ea98fd1759f1315a9da531e5c2e5ed8b94369f5"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.521713 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"385ebee4-3c06-4dcb-89d4-999ba793a9ba","Type":"ContainerStarted","Data":"cfb360083edf9d08a20272d5f7ca0aec35055c4e7e0874048d95f64598422a3b"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.523122 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.542925 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podStartSLOduration=2.542901749 podStartE2EDuration="2.542901749s" podCreationTimestamp="2026-03-16 00:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:09.541773846 +0000 UTC m=+218.141872276" watchObservedRunningTime="2026-03-16 00:10:09.542901749 +0000 UTC m=+218.143000189"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.543680 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podStartSLOduration=2.543672732 podStartE2EDuration="2.543672732s" podCreationTimestamp="2026-03-16 00:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:09.521530162 +0000 UTC m=+218.121628592" watchObservedRunningTime="2026-03-16 00:10:09.543672732 +0000 UTC m=+218.143771162"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.610684 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.612509 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.112488526 +0000 UTC m=+218.712586966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.715636 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.716153 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.21613738 +0000 UTC m=+218.816235840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.816567 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.816997 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.31697737 +0000 UTC m=+218.917075800 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.829222 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:09 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:09 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:09 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.829284 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.832443 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.917745 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.918148 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.418129699 +0000 UTC m=+219.018228129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.021050 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.021573 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.521541165 +0000 UTC m=+219.121639595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.114007 4983 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lc9bv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]log ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]etcd ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/max-in-flight-filter ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 16 00:10:10 crc kubenswrapper[4983]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 16 00:10:10 crc kubenswrapper[4983]: livez check failed
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.114079 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" podUID="249f0516-0237-4ba3-92eb-a7aa3b9c62c1" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.122101 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.122503 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.622490979 +0000 UTC m=+219.222589409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.223370 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.723350069 +0000 UTC m=+219.323448499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.223574 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.223849 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.224113 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.724105161 +0000 UTC m=+219.324203591 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.325328 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.325520 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.825494498 +0000 UTC m=+219.425592928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.325633 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.325931 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.82591921 +0000 UTC m=+219.426017640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.426318 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.426503 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.926477962 +0000 UTC m=+219.526576392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.426898 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.926891384 +0000 UTC m=+219.526989814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.426952 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.443538 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mjkh8" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.517851 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44292: no serving certificate available for the kubelet" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.527501 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.527696 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.027670322 +0000 UTC m=+219.627768752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.527806 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.528088 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.028076454 +0000 UTC m=+219.628174884 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.532733 4983 generic.go:334] "Generic (PLEG): container finished" podID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerID="27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32" exitCode=0 Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.532821 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerDied","Data":"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32"} Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.535275 4983 generic.go:334] "Generic (PLEG): container finished" podID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerID="0601a98e47222baf45860438cfc29d0447fa64cf46cd7bead9a6ef97f07beb9c" exitCode=0 Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.535344 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerDied","Data":"0601a98e47222baf45860438cfc29d0447fa64cf46cd7bead9a6ef97f07beb9c"} Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.537924 4983 generic.go:334] "Generic (PLEG): container finished" podID="385ebee4-3c06-4dcb-89d4-999ba793a9ba" containerID="4e388f80539aba9aecacdcf41ea98fd1759f1315a9da531e5c2e5ed8b94369f5" exitCode=0 Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.538014 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"385ebee4-3c06-4dcb-89d4-999ba793a9ba","Type":"ContainerDied","Data":"4e388f80539aba9aecacdcf41ea98fd1759f1315a9da531e5c2e5ed8b94369f5"} Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.541363 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerStarted","Data":"66c02382f4884cf7432e8b1dd2d9aae721248d87c7cd3a1bce60e42991bb56c4"} Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.629948 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.630098 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.130077009 +0000 UTC m=+219.730175439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.630584 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.631744 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.131729308 +0000 UTC m=+219.731827738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.733772 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.734052 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.233964329 +0000 UTC m=+219.834062759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.734112 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.734464 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.234447794 +0000 UTC m=+219.834546224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.809922 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.823786 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:10 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:10 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:10 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.823850 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.835576 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.835872 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.335858951 +0000 UTC m=+219.935957381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.936455 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") pod \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.936861 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") pod \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.936879 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" (UID: "1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.937206 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.937627 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.437612558 +0000 UTC m=+220.037710988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.937935 4983 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.942693 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" (UID: "1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.039459 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.039697 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.539654683 +0000 UTC m=+220.139753113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.040103 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.040189 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.040553 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.54053988 +0000 UTC m=+220.140638360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.140895 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.141434 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:11.6413945 +0000 UTC m=+220.241492930 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.242833 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.243405 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.743386334 +0000 UTC m=+220.343484774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.345131 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.345367 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.845338767 +0000 UTC m=+220.445437197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.345963 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.346412 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.846400969 +0000 UTC m=+220.446499399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.446732 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.447012 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.946980481 +0000 UTC m=+220.547078921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.447099 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.447367 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.947354822 +0000 UTC m=+220.547453252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.539886 4983 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.548839 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.549027 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.049005616 +0000 UTC m=+220.649104046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.549064 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.550394 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.050376217 +0000 UTC m=+220.650474637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.553857 4983 generic.go:334] "Generic (PLEG): container finished" podID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerID="b9712062cb37f4ba2339e9dc2def8ff36e2a54d5fce9ebcc83e68db1e8c9e216" exitCode=0 Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.553935 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerDied","Data":"b9712062cb37f4ba2339e9dc2def8ff36e2a54d5fce9ebcc83e68db1e8c9e216"} Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.562413 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"35c30c12a549c9f3a066c2d3d7362fbdedb473c53e36f73d0bb2b4532a71aa3e"} Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.562468 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"6ed9f58a27e42d37ee961a0fda8db32a5fda0f9c1a37b58b3524532b2d28e46d"} Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.565729 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b","Type":"ContainerDied","Data":"5b6b697b43a3ca9aed435659be5a4adfa260345d670f3e9fc7b2402ed1c8de07"} Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.565775 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b6b697b43a3ca9aed435659be5a4adfa260345d670f3e9fc7b2402ed1c8de07" Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.565890 4983 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.650860 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.651479 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.151233637 +0000 UTC m=+220.751332067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.652273 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.653851 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.153837475 +0000 UTC m=+220.753935895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.753400 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.753603 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.253577262 +0000 UTC m=+220.853675692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.753650 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.754060 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.254046076 +0000 UTC m=+220.854144506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.820967 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:11 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:11 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:11 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.821050 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.854739 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.855154 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.355134713 +0000 UTC m=+220.955233143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.955830 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.956181 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.456166158 +0000 UTC m=+221.056264588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.056816 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.056899 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.556873584 +0000 UTC m=+221.156972014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.057386 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.057795 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.557779061 +0000 UTC m=+221.157877491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.159902 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.160063 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.660036383 +0000 UTC m=+221.260134813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.160240 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.160548 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.660538268 +0000 UTC m=+221.260636698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.261688 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.261999 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.761984506 +0000 UTC m=+221.362082936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.310838 4983 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-16T00:10:11.539912035Z","Handler":null,"Name":""} Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.313048 4983 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.313074 4983 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.362665 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.365334 4983 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.365363 4983 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.413642 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.463742 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.470386 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.540009 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.574447 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"9ee5bb4119e3b16046ba33eca7ca88e39672de7857fa0ee6fe3cdfffeb59f2f3"} Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.605314 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" podStartSLOduration=17.605289533 podStartE2EDuration="17.605289533s" podCreationTimestamp="2026-03-16 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:12.594535042 +0000 UTC m=+221.194633492" watchObservedRunningTime="2026-03-16 00:10:12.605289533 +0000 UTC m=+221.205387973" Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.821328 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:12 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:12 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:12 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.821408 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.230172 4983 ???:1] "http: TLS handshake error from 192.168.126.11:52122: no serving certificate available for the kubelet" Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.820692 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:13 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:13 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:13 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.821179 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.844873 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.848765 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:10:14 crc kubenswrapper[4983]: I0316 00:10:14.100698 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 16 00:10:14 crc kubenswrapper[4983]: I0316 00:10:14.820125 4983 patch_prober.go:28] 
interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:14 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:14 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:14 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:14 crc kubenswrapper[4983]: I0316 00:10:14.820176 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:15 crc kubenswrapper[4983]: I0316 00:10:15.658679 4983 ???:1] "http: TLS handshake error from 192.168.126.11:52126: no serving certificate available for the kubelet" Mar 16 00:10:15 crc kubenswrapper[4983]: I0316 00:10:15.820597 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:15 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:15 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:15 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:15 crc kubenswrapper[4983]: I0316 00:10:15.821225 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:15 crc kubenswrapper[4983]: I0316 00:10:15.912652 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.015921 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") pod \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.016583 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") pod \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.016728 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "385ebee4-3c06-4dcb-89d4-999ba793a9ba" (UID: "385ebee4-3c06-4dcb-89d4-999ba793a9ba"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.017133 4983 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.024889 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "385ebee4-3c06-4dcb-89d4-999ba793a9ba" (UID: "385ebee4-3c06-4dcb-89d4-999ba793a9ba"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.117969 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.609613 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"385ebee4-3c06-4dcb-89d4-999ba793a9ba","Type":"ContainerDied","Data":"cfb360083edf9d08a20272d5f7ca0aec35055c4e7e0874048d95f64598422a3b"} Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.609653 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfb360083edf9d08a20272d5f7ca0aec35055c4e7e0874048d95f64598422a3b" Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.609676 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.832196 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:16 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:16 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:16 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.832270 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.821164 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:17 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:17 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:17 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.821253 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.880187 4983 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.880220 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.880232 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.880270 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:10:18 crc kubenswrapper[4983]: I0316 00:10:18.096416 4983 patch_prober.go:28] interesting pod/console-f9d7485db-fp4l5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 16 00:10:18 crc kubenswrapper[4983]: I0316 00:10:18.096473 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fp4l5" podUID="d76474c2-7d5c-45a0-8869-d829b0c594d6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 16 00:10:18 crc kubenswrapper[4983]: I0316 00:10:18.820398 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:18 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:18 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:18 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:18 crc kubenswrapper[4983]: I0316 00:10:18.820733 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:19 crc kubenswrapper[4983]: I0316 00:10:19.819877 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:19 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:19 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:19 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:19 crc kubenswrapper[4983]: I0316 00:10:19.819955 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:20 crc kubenswrapper[4983]: I0316 00:10:20.823211 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:20 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:20 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:20 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:20 crc kubenswrapper[4983]: I0316 00:10:20.823505 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:21 crc kubenswrapper[4983]: I0316 00:10:21.821434 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:10:21 crc kubenswrapper[4983]: I0316 00:10:21.824981 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:10:23 crc kubenswrapper[4983]: I0316 00:10:23.449171 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:10:23 crc kubenswrapper[4983]: I0316 00:10:23.449549 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:10:25 crc kubenswrapper[4983]: I0316 00:10:25.905191 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:10:25 crc kubenswrapper[4983]: I0316 00:10:25.905647 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" containerID="cri-o://3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8" gracePeriod=30 Mar 16 00:10:25 crc kubenswrapper[4983]: I0316 00:10:25.919053 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:25 crc kubenswrapper[4983]: I0316 00:10:25.919248 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" containerID="cri-o://6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908" gracePeriod=30 Mar 16 00:10:26 crc kubenswrapper[4983]: E0316 00:10:26.754969 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 16 00:10:26 crc 
kubenswrapper[4983]: E0316 00:10:26.756661 4983 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:10:26 crc kubenswrapper[4983]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 16 00:10:26 crc kubenswrapper[4983]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gs7vq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29560328-sngnj_openshift-infra(9da42bf3-da76-4db7-9653-f2f08567084f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 16 00:10:26 crc kubenswrapper[4983]: > logger="UnhandledError" Mar 16 00:10:26 crc kubenswrapper[4983]: E0316 00:10:26.757804 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29560328-sngnj" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.557001 4983 patch_prober.go:28] interesting pod/controller-manager-76ff476bcc-pgmwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.557074 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.581562 4983 patch_prober.go:28] interesting pod/route-controller-manager-74d65d8956-b8lr7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.581631 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.685557 4983 generic.go:334] "Generic (PLEG): container finished" 
podID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerID="6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908" exitCode=0 Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.685656 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" event={"ID":"5a30db24-326a-4f24-8ea0-e3d1367a2b76","Type":"ContainerDied","Data":"6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908"} Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.688205 4983 generic.go:334] "Generic (PLEG): container finished" podID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerID="3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8" exitCode=0 Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.688295 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" event={"ID":"99a90707-df7a-4c5f-9502-47f5eaafa320","Type":"ContainerDied","Data":"3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8"} Mar 16 00:10:27 crc kubenswrapper[4983]: E0316 00:10:27.689884 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29560328-sngnj" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.891024 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:10:28 crc kubenswrapper[4983]: I0316 00:10:28.102348 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:10:28 crc kubenswrapper[4983]: I0316 00:10:28.106822 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:10:29 crc kubenswrapper[4983]: E0316 00:10:29.203310 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 16 00:10:29 crc kubenswrapper[4983]: E0316 00:10:29.203463 4983 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:10:29 crc kubenswrapper[4983]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 16 00:10:29 crc kubenswrapper[4983]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtp9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
auto-csr-approver-29560330-65dr5_openshift-infra(c39b8480-5521-4ff7-b6ec-4f67009b1f5c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 16 00:10:29 crc kubenswrapper[4983]: > logger="UnhandledError" Mar 16 00:10:29 crc kubenswrapper[4983]: E0316 00:10:29.204675 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29560330-65dr5" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" Mar 16 00:10:29 crc kubenswrapper[4983]: E0316 00:10:29.704150 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29560330-65dr5" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" Mar 16 00:10:36 crc kubenswrapper[4983]: I0316 00:10:36.159837 4983 ???:1] "http: TLS handshake error from 192.168.126.11:41342: no serving certificate available for the kubelet" Mar 16 00:10:37 crc kubenswrapper[4983]: I0316 00:10:37.556850 4983 patch_prober.go:28] interesting pod/controller-manager-76ff476bcc-pgmwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 16 00:10:37 crc kubenswrapper[4983]: I0316 00:10:37.556913 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 16 00:10:37 crc kubenswrapper[4983]: I0316 00:10:37.581490 4983 patch_prober.go:28] interesting pod/route-controller-manager-74d65d8956-b8lr7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 16 00:10:37 crc kubenswrapper[4983]: I0316 00:10:37.581534 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4983]: I0316 00:10:38.620966 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895213 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:10:40 crc kubenswrapper[4983]: E0316 00:10:40.895817 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385ebee4-3c06-4dcb-89d4-999ba793a9ba" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895834 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="385ebee4-3c06-4dcb-89d4-999ba793a9ba" containerName="pruner" Mar 16 
00:10:40 crc kubenswrapper[4983]: E0316 00:10:40.895846 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895853 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895963 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="385ebee4-3c06-4dcb-89d4-999ba793a9ba" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895978 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.896398 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.901812 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.902155 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.915219 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.985261 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.985590 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.088825 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.088906 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.089362 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.114497 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.222014 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.620232 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:45 crc kubenswrapper[4983]: E0316 00:10:45.476505 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3436131020/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 16 00:10:45 crc kubenswrapper[4983]: E0316 00:10:45.476926 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msk49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-56c2t_openshift-marketplace(8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3436131020/2\": happened during read: context canceled" logger="UnhandledError" Mar 16 00:10:45 crc kubenswrapper[4983]: E0316 00:10:45.478167 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage3436131020/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-56c2t" 
podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.490566 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.491610 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.496861 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.539246 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.539291 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.539413 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.640530 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.640633 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.640661 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.640734 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.641007 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.659539 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.827072 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:46 crc kubenswrapper[4983]: E0316 00:10:46.685185 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 16 00:10:46 crc kubenswrapper[4983]: E0316 00:10:46.685348 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xbbl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vxnxc_openshift-marketplace(f617dbbc-f757-49b9-b8c6-7d0c07cb197e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:10:46 crc kubenswrapper[4983]: E0316 00:10:46.686684 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vxnxc" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" Mar 16 00:10:48 crc kubenswrapper[4983]: I0316 00:10:48.556432 4983 patch_prober.go:28] interesting pod/controller-manager-76ff476bcc-pgmwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:10:48 crc kubenswrapper[4983]: I0316 00:10:48.556515 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:10:48 crc kubenswrapper[4983]: I0316 00:10:48.581164 4983 patch_prober.go:28] interesting pod/route-controller-manager-74d65d8956-b8lr7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:10:48 crc kubenswrapper[4983]: I0316 00:10:48.581229 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:10:50 crc kubenswrapper[4983]: E0316 00:10:50.021966 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vxnxc" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" Mar 16 00:10:50 crc kubenswrapper[4983]: E0316 00:10:50.022036 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-56c2t" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.073333 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.101376 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") pod \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.101451 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") pod \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.101517 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") pod \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.101537 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") pod \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.102664 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a30db24-326a-4f24-8ea0-e3d1367a2b76" (UID: "5a30db24-326a-4f24-8ea0-e3d1367a2b76"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.102683 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config" (OuterVolumeSpecName: "config") pod "5a30db24-326a-4f24-8ea0-e3d1367a2b76" (UID: "5a30db24-326a-4f24-8ea0-e3d1367a2b76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.107417 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a30db24-326a-4f24-8ea0-e3d1367a2b76" (UID: "5a30db24-326a-4f24-8ea0-e3d1367a2b76"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.108033 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp" (OuterVolumeSpecName: "kube-api-access-mdbwp") pod "5a30db24-326a-4f24-8ea0-e3d1367a2b76" (UID: "5a30db24-326a-4f24-8ea0-e3d1367a2b76"). InnerVolumeSpecName "kube-api-access-mdbwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.109628 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"] Mar 16 00:10:50 crc kubenswrapper[4983]: E0316 00:10:50.109869 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.109883 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.110032 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.110531 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.120524 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"] Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.202728 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203036 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203166 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203252 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203391 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203489 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203574 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203636 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.305174 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.305259 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.305367 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.305401 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.306593 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.309442 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.319735 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: 
\"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.339817 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.446134 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.846183 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" event={"ID":"5a30db24-326a-4f24-8ea0-e3d1367a2b76","Type":"ContainerDied","Data":"750a85be2f0f629cc184ac0a4c018b832bba1ef5898acd3b3254238edafdcee9"} Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.846231 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.846243 4983 scope.go:117] "RemoveContainer" containerID="6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.863844 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.867019 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:52 crc kubenswrapper[4983]: I0316 00:10:52.098303 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" path="/var/lib/kubelet/pods/5a30db24-326a-4f24-8ea0-e3d1367a2b76/volumes" Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.448247 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.448534 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.448581 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.449128 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.449177 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383" gracePeriod=600 Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.068278 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.068458 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kmd9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-b68d7_openshift-marketplace(cbebf69d-773f-4829-a4ec-e443d52ef275): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.069635 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-b68d7" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.083112 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.083235 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8x8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kjc2w_openshift-marketplace(00a4a2a2-9263-4b76-8294-fa9c4d918fc7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.084366 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kjc2w" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.110034 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kjc2w" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.156369 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.175948 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.176159 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bqln2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-txzqn_openshift-marketplace(ca55ad69-3f41-4d0c-8f86-83a583ff6fe4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.178345 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-txzqn" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.189773 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"] Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.190079 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.190103 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.190228 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" Mar 16 00:10:58 crc 
kubenswrapper[4983]: I0316 00:10:58.190726 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.199658 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"] Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.205840 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.206168 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.206200 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.206261 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.206444 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308303 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308341 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308373 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") pod 
\"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308435 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308488 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308649 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308688 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308706 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308731 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308788 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.311723 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca" (OuterVolumeSpecName: "client-ca") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.311793 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config" (OuterVolumeSpecName: "config") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.312380 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.312666 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.312838 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.312933 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.315411 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.315885 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.316125 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.317877 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5" (OuterVolumeSpecName: "kube-api-access-mbtv5") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). 
InnerVolumeSpecName "kube-api-access-mbtv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.329195 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.410669 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.411282 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.411331 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.411352 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.411368 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.509901 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.558082 4983 patch_prober.go:28] interesting pod/controller-manager-76ff476bcc-pgmwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.558173 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.893423 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" event={"ID":"99a90707-df7a-4c5f-9502-47f5eaafa320","Type":"ContainerDied","Data":"75bf14131d5b8d3db0d67d7f812d7d6f097de077cb0e31a121d0d18e80488d4e"} Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.893473 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.896011 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383" exitCode=0 Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.896134 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383"} Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.937223 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.939241 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:11:00 crc kubenswrapper[4983]: I0316 00:11:00.104535 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" path="/var/lib/kubelet/pods/99a90707-df7a-4c5f-9502-47f5eaafa320/volumes" Mar 16 00:11:00 crc kubenswrapper[4983]: E0316 00:11:00.622974 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 16 00:11:00 crc kubenswrapper[4983]: E0316 00:11:00.623548 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4gpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sv5g7_openshift-marketplace(b6bd9bf5-fa59-4fef-9589-7b5865098bd2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:00 crc 
kubenswrapper[4983]: E0316 00:11:00.625116 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sv5g7" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" Mar 16 00:11:01 crc kubenswrapper[4983]: E0316 00:11:01.059078 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 16 00:11:01 crc kubenswrapper[4983]: E0316 00:11:01.059247 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cm28s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hsgsl_openshift-marketplace(8fd3d4ca-4839-4327-8121-fe6ba21051da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:01 crc kubenswrapper[4983]: E0316 00:11:01.060437 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hsgsl" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" Mar 16 00:11:04 crc kubenswrapper[4983]: E0316 00:11:04.973064 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sv5g7" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" Mar 16 00:11:04 crc kubenswrapper[4983]: E0316 00:11:04.975254 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-txzqn" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" Mar 16 00:11:04 crc kubenswrapper[4983]: E0316 00:11:04.975446 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hsgsl" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" Mar 16 00:11:04 crc kubenswrapper[4983]: W0316 00:11:04.979926 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a099f86_8967_4361_bbbf_4dfa8385d2f2.slice/crio-e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f WatchSource:0}: Error finding container e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f: Status 404 returned error can't find the container with id e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f Mar 16 00:11:05 crc kubenswrapper[4983]: E0316 00:11:05.164438 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 16 00:11:05 crc kubenswrapper[4983]: E0316 00:11:05.164902 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8nr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7qx9g_openshift-marketplace(7bc03354-3cba-40ac-a894-844d6ae1ee69): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:05 crc kubenswrapper[4983]: E0316 00:11:05.166119 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7qx9g" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.272883 4983 scope.go:117] "RemoveContainer" containerID="3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8" Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.560895 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.685286 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"] Mar 16 00:11:05 crc kubenswrapper[4983]: W0316 00:11:05.703241 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abd0e95_c153_4402_b4c0_447e8df8ef5e.slice/crio-ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b WatchSource:0}: Error finding container ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b: Status 404 returned error can't find the container with id ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.865621 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.921805 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"] Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.948409 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b93405e1-68b8-43ab-9628-cfd937aeca3f","Type":"ContainerStarted","Data":"ca9f3bcfd67825a8be572eb5e49d99ffdc8f464436504835a72dd955b5d125ac"} Mar 16 00:11:05 crc kubenswrapper[4983]: W0316 00:11:05.964419 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9820571e_90e5_4a57_925f_6dee047d6c9d.slice/crio-0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef WatchSource:0}: Error finding container 0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef: Status 404 returned error can't find the container with id 0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.980375 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef","Type":"ContainerStarted","Data":"bb9e1da16a29be893a6a9d10f13e6b9a3bf25b7bf35da6d8f078d76e4ab8219e"} Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.988037 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" event={"ID":"0a099f86-8967-4361-bbbf-4dfa8385d2f2","Type":"ContainerStarted","Data":"e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f"} Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.989485 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" 
event={"ID":"4abd0e95-c153-4402-b4c0-447e8df8ef5e","Type":"ContainerStarted","Data":"ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b"} Mar 16 00:11:05 crc kubenswrapper[4983]: E0316 00:11:05.997298 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7qx9g" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" Mar 16 00:11:06 crc kubenswrapper[4983]: I0316 00:11:06.627203 4983 csr.go:261] certificate signing request csr-fkfcq is approved, waiting to be issued Mar 16 00:11:06 crc kubenswrapper[4983]: I0316 00:11:06.633443 4983 csr.go:257] certificate signing request csr-fkfcq is issued Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.000152 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef","Type":"ContainerStarted","Data":"9fde2814949dd21f55871ee57d9c0de0a132a8749d55cb58695f078937d0a417"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.001309 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" event={"ID":"0a099f86-8967-4361-bbbf-4dfa8385d2f2","Type":"ContainerStarted","Data":"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.001434 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.002522 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-65dr5" event={"ID":"c39b8480-5521-4ff7-b6ec-4f67009b1f5c","Type":"ContainerStarted","Data":"76d2b798a64d4809150e865ba49cceb6346042cb22c2796d78469f6cd57fde6c"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.004329 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" event={"ID":"4abd0e95-c153-4402-b4c0-447e8df8ef5e","Type":"ContainerStarted","Data":"7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.004555 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.005930 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" event={"ID":"9820571e-90e5-4a57-925f-6dee047d6c9d","Type":"ContainerStarted","Data":"29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.005957 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" event={"ID":"9820571e-90e5-4a57-925f-6dee047d6c9d","Type":"ContainerStarted","Data":"0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.006792 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.008292 4983 generic.go:334] "Generic (PLEG): container 
finished" podID="9da42bf3-da76-4db7-9653-f2f08567084f" containerID="f1d9cd29662f3f229511dac637df41ff7b782921910c342dbfa3015d6466b383" exitCode=0 Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.008324 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560328-sngnj" event={"ID":"9da42bf3-da76-4db7-9653-f2f08567084f","Type":"ContainerDied","Data":"f1d9cd29662f3f229511dac637df41ff7b782921910c342dbfa3015d6466b383"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.011211 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.011260 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.011808 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.012783 4983 generic.go:334] "Generic (PLEG): container finished" podID="b93405e1-68b8-43ab-9628-cfd937aeca3f" containerID="71d1cd633bfe3af34262442e473b5136134787de19b07e78235e338e5e0f0440" exitCode=0 Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.012817 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b93405e1-68b8-43ab-9628-cfd937aeca3f","Type":"ContainerDied","Data":"71d1cd633bfe3af34262442e473b5136134787de19b07e78235e338e5e0f0440"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.020675 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=22.020658956 podStartE2EDuration="22.020658956s" podCreationTimestamp="2026-03-16 00:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:07.019178571 +0000 UTC m=+275.619277001" watchObservedRunningTime="2026-03-16 00:11:07.020658956 +0000 UTC m=+275.620757386" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.041055 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" podStartSLOduration=21.041033134 podStartE2EDuration="21.041033134s" podCreationTimestamp="2026-03-16 00:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:07.038051225 +0000 UTC m=+275.638149655" watchObservedRunningTime="2026-03-16 00:11:07.041033134 +0000 UTC m=+275.641131574" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.161662 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" podStartSLOduration=228.161645264 podStartE2EDuration="3m48.161645264s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:07.159539551 +0000 UTC m=+275.759637991" watchObservedRunningTime="2026-03-16 00:11:07.161645264 
+0000 UTC m=+275.761743694" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.192539 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" podStartSLOduration=22.192523385 podStartE2EDuration="22.192523385s" podCreationTimestamp="2026-03-16 00:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:07.191095963 +0000 UTC m=+275.791194393" watchObservedRunningTime="2026-03-16 00:11:07.192523385 +0000 UTC m=+275.792621815" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.635375 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-29 12:54:03.680179002 +0000 UTC Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.635637 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6924h42m56.044544061s for next certificate rotation Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.021346 4983 generic.go:334] "Generic (PLEG): container finished" podID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerID="210bd7f5ab48e451b18cd186b0e612a0157714bee428a4d39d25cdd92c0f3eb0" exitCode=0 Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.021391 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerDied","Data":"210bd7f5ab48e451b18cd186b0e612a0157714bee428a4d39d25cdd92c0f3eb0"} Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.023357 4983 generic.go:334] "Generic (PLEG): container finished" podID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" containerID="76d2b798a64d4809150e865ba49cceb6346042cb22c2796d78469f6cd57fde6c" exitCode=0 Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.023445 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-65dr5" event={"ID":"c39b8480-5521-4ff7-b6ec-4f67009b1f5c","Type":"ContainerDied","Data":"76d2b798a64d4809150e865ba49cceb6346042cb22c2796d78469f6cd57fde6c"} Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.026451 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerStarted","Data":"4fd735d9c2a8af79e35b41af9d3f84d5c4faeb3f496099e9f47662ac9f90becf"} Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.341867 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-65dr5" Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.478589 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.489521 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560328-sngnj"
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.514360 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") pod \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\" (UID: \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\") "
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.520007 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b" (OuterVolumeSpecName: "kube-api-access-wtp9b") pod "c39b8480-5521-4ff7-b6ec-4f67009b1f5c" (UID: "c39b8480-5521-4ff7-b6ec-4f67009b1f5c"). InnerVolumeSpecName "kube-api-access-wtp9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616053 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") pod \"9da42bf3-da76-4db7-9653-f2f08567084f\" (UID: \"9da42bf3-da76-4db7-9653-f2f08567084f\") "
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616129 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") pod \"b93405e1-68b8-43ab-9628-cfd937aeca3f\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") "
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616215 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") pod \"b93405e1-68b8-43ab-9628-cfd937aeca3f\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") "
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616293 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b93405e1-68b8-43ab-9628-cfd937aeca3f" (UID: "b93405e1-68b8-43ab-9628-cfd937aeca3f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616804 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616840 4983 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.619412 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b93405e1-68b8-43ab-9628-cfd937aeca3f" (UID: "b93405e1-68b8-43ab-9628-cfd937aeca3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.619679 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq" (OuterVolumeSpecName: "kube-api-access-gs7vq") pod "9da42bf3-da76-4db7-9653-f2f08567084f" (UID: "9da42bf3-da76-4db7-9653-f2f08567084f"). InnerVolumeSpecName "kube-api-access-gs7vq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.635998 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 01:32:35.360868712 +0000 UTC
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.636026 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6289h21m26.724844872s for next certificate rotation
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.718409 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.718459 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.032950 4983 generic.go:334] "Generic (PLEG): container finished" podID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerID="4fd735d9c2a8af79e35b41af9d3f84d5c4faeb3f496099e9f47662ac9f90becf" exitCode=0
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.033025 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerDied","Data":"4fd735d9c2a8af79e35b41af9d3f84d5c4faeb3f496099e9f47662ac9f90becf"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.037348 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560328-sngnj" event={"ID":"9da42bf3-da76-4db7-9653-f2f08567084f","Type":"ContainerDied","Data":"fde617a4855b193426c3b4102e81b29ab0d3e6c44d90e708f2f6bda3bb35ebf8"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.037400 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fde617a4855b193426c3b4102e81b29ab0d3e6c44d90e708f2f6bda3bb35ebf8"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.037361 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560328-sngnj"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.043818 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b93405e1-68b8-43ab-9628-cfd937aeca3f","Type":"ContainerDied","Data":"ca9f3bcfd67825a8be572eb5e49d99ffdc8f464436504835a72dd955b5d125ac"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.043855 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9f3bcfd67825a8be572eb5e49d99ffdc8f464436504835a72dd955b5d125ac"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.043871 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.045557 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerStarted","Data":"6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.048317 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-65dr5"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.048452 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-65dr5" event={"ID":"c39b8480-5521-4ff7-b6ec-4f67009b1f5c","Type":"ContainerDied","Data":"7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.048505 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.088198 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vxnxc" podStartSLOduration=2.681238605 podStartE2EDuration="1m4.088178685s" podCreationTimestamp="2026-03-16 00:10:05 +0000 UTC" firstStartedPulling="2026-03-16 00:10:07.303089218 +0000 UTC m=+215.903187638" lastFinishedPulling="2026-03-16 00:11:08.710029288 +0000 UTC m=+277.310127718" observedRunningTime="2026-03-16 00:11:09.083745213 +0000 UTC m=+277.683843643" watchObservedRunningTime="2026-03-16 00:11:09.088178685 +0000 UTC m=+277.688277115"
Mar 16 00:11:13 crc kubenswrapper[4983]: I0316 00:11:12.072912 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerStarted","Data":"ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f"}
Mar 16 00:11:13 crc kubenswrapper[4983]: I0316 00:11:12.091028 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-56c2t" podStartSLOduration=3.839430655 podStartE2EDuration="1m4.09100568s" podCreationTimestamp="2026-03-16 00:10:08 +0000 UTC" firstStartedPulling="2026-03-16 00:10:10.53965046 +0000 UTC m=+219.139748890" lastFinishedPulling="2026-03-16 00:11:10.791225485 +0000 UTC m=+279.391323915" observedRunningTime="2026-03-16 00:11:12.090281348 +0000 UTC m=+280.690379778" watchObservedRunningTime="2026-03-16 00:11:12.09100568 +0000 UTC m=+280.691104110"
Mar 16 00:11:13 crc kubenswrapper[4983]: I0316 00:11:13.081177 4983 generic.go:334] "Generic (PLEG): container finished" podID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerID="b6acaa7dffa774e191a9bf342869bf819b4d039ee2bd145b14e03704f80e4abc" exitCode=0
Mar 16 00:11:13 crc kubenswrapper[4983]: I0316 00:11:13.081304 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerDied","Data":"b6acaa7dffa774e191a9bf342869bf819b4d039ee2bd145b14e03704f80e4abc"}
Mar 16 00:11:15 crc kubenswrapper[4983]: I0316 00:11:15.932686 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:11:15 crc kubenswrapper[4983]: I0316 00:11:15.932968 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:11:16 crc kubenswrapper[4983]: I0316 00:11:16.683614 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:11:16 crc kubenswrapper[4983]: I0316 00:11:16.783264 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:11:18 crc kubenswrapper[4983]: I0316 00:11:18.987953 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:11:18 crc kubenswrapper[4983]: I0316 00:11:18.988331 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:11:19 crc kubenswrapper[4983]: I0316 00:11:19.030739 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:11:19 crc kubenswrapper[4983]: I0316 00:11:19.162211 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:11:20 crc kubenswrapper[4983]: I0316 00:11:20.119950 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerStarted","Data":"c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d"}
Mar 16 00:11:20 crc kubenswrapper[4983]: I0316 00:11:20.140840 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b68d7" podStartSLOduration=4.222317305 podStartE2EDuration="1m13.140822982s" podCreationTimestamp="2026-03-16 00:10:07 +0000 UTC" firstStartedPulling="2026-03-16 00:10:10.54636499 +0000 UTC m=+219.146463420" lastFinishedPulling="2026-03-16 00:11:19.464870667 +0000 UTC m=+288.064969097" observedRunningTime="2026-03-16 00:11:20.138370329 +0000 UTC m=+288.738468759" watchObservedRunningTime="2026-03-16 00:11:20.140822982 +0000 UTC m=+288.740921412"
Mar 16 00:11:21 crc kubenswrapper[4983]: I0316 00:11:21.126444 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerStarted","Data":"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf"}
Mar 16 00:11:21 crc kubenswrapper[4983]: I0316 00:11:21.128034 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerStarted","Data":"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"}
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.143565 4983 generic.go:334] "Generic (PLEG): container finished" podID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerID="0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf" exitCode=0
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.143769 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerDied","Data":"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf"}
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.146105 4983 generic.go:334] "Generic (PLEG): container finished" podID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerID="86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404" exitCode=0
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.146152 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerDied","Data":"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"}
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.546239 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:11:25 crc kubenswrapper[4983]: I0316 00:11:25.904904 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"]
Mar 16 00:11:25 crc kubenswrapper[4983]: I0316 00:11:25.905631 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager" containerID="cri-o://29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a" gracePeriod=30
Mar 16 00:11:25 crc kubenswrapper[4983]: I0316 00:11:25.999873 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"]
Mar 16 00:11:26 crc kubenswrapper[4983]: I0316 00:11:26.000146 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager" containerID="cri-o://7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a" gracePeriod=30
Mar 16 00:11:27 crc kubenswrapper[4983]: I0316 00:11:27.176108 4983 generic.go:334] "Generic (PLEG): container finished" podID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerID="7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a" exitCode=0
Mar 16 00:11:27 crc kubenswrapper[4983]: I0316 00:11:27.176280 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" event={"ID":"4abd0e95-c153-4402-b4c0-447e8df8ef5e","Type":"ContainerDied","Data":"7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a"}
Mar 16 00:11:27 crc kubenswrapper[4983]: I0316 00:11:27.178370 4983 generic.go:334] "Generic (PLEG): container finished" podID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerID="29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a" exitCode=0
Mar 16 00:11:27 crc kubenswrapper[4983]: I0316 00:11:27.178395 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" event={"ID":"9820571e-90e5-4a57-925f-6dee047d6c9d","Type":"ContainerDied","Data":"29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a"}
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.018176 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.018238 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.051894 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.223357 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.510821 4983 patch_prober.go:28] interesting pod/controller-manager-678b5dcc8b-f4ncq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body=
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.510874 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused"
Mar 16 00:11:30 crc kubenswrapper[4983]: I0316 00:11:30.447442 4983 patch_prober.go:28] interesting pod/route-controller-manager-54dd5cd958-f2lqt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body=
Mar 16 00:11:30 crc kubenswrapper[4983]: I0316 00:11:30.447507 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.883850 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.917341 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"]
Mar 16 00:11:33 crc kubenswrapper[4983]: E0316 00:11:33.917937 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.917954 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: E0316 00:11:33.917999 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918008 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: E0316 00:11:33.918021 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93405e1-68b8-43ab-9628-cfd937aeca3f" containerName="pruner"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918029 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93405e1-68b8-43ab-9628-cfd937aeca3f" containerName="pruner"
Mar 16 00:11:33 crc kubenswrapper[4983]: E0316 00:11:33.918043 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918079 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918265 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918281 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93405e1-68b8-43ab-9628-cfd937aeca3f" containerName="pruner"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918290 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918328 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.919056 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.921117 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"]
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.038499 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") pod \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") "
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.038687 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") pod \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") "
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.038892 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") pod \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") "
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.038954 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") pod \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") "
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039287 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca" (OuterVolumeSpecName: "client-ca") pod "4abd0e95-c153-4402-b4c0-447e8df8ef5e" (UID: "4abd0e95-c153-4402-b4c0-447e8df8ef5e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039450 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039562 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039604 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039631 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039726 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039747 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config" (OuterVolumeSpecName: "config") pod "4abd0e95-c153-4402-b4c0-447e8df8ef5e" (UID: "4abd0e95-c153-4402-b4c0-447e8df8ef5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.044160 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4abd0e95-c153-4402-b4c0-447e8df8ef5e" (UID: "4abd0e95-c153-4402-b4c0-447e8df8ef5e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.044855 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp" (OuterVolumeSpecName: "kube-api-access-fwzcp") pod "4abd0e95-c153-4402-b4c0-447e8df8ef5e" (UID: "4abd0e95-c153-4402-b4c0-447e8df8ef5e"). InnerVolumeSpecName "kube-api-access-fwzcp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.141736 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.141842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.141863 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.141881 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.142000 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.142018 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.142030 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.143543 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.143606 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.146210 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.170601 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.233094 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.233119 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" event={"ID":"4abd0e95-c153-4402-b4c0-447e8df8ef5e","Type":"ContainerDied","Data":"ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b"}
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.233183 4983 scope.go:117] "RemoveContainer" containerID="7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.233254 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.257721 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"]
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.261618 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"]
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.106350 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" path="/var/lib/kubelet/pods/4abd0e95-c153-4402-b4c0-447e8df8ef5e/volumes"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.245619 4983 generic.go:334] "Generic (PLEG): container finished" podID="9f5bd50b-b197-4deb-ac50-768e3baa6cff" containerID="b9e245e332a00fe31e8a513f16d938a911b68f20bd84b7aa4a069280729c1f31" exitCode=0
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.245665 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-9tclx" event={"ID":"9f5bd50b-b197-4deb-ac50-768e3baa6cff","Type":"ContainerDied","Data":"b9e245e332a00fe31e8a513f16d938a911b68f20bd84b7aa4a069280729c1f31"}
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.384838 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.410477 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"]
Mar 16 00:11:36 crc kubenswrapper[4983]: E0316 00:11:36.410723 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.410734 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.410847 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.411187 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.428772 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"]
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.468917 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.468965 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.468992 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.469103 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.469148 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570069 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") "
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570128 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") "
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570148 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") "
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570229 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") "
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570253 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") "
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570375 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570400 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570423 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570466 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570512 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571210 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571491 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571615 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config" (OuterVolumeSpecName: "config") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571719 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571801 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca" (OuterVolumeSpecName: "client-ca") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571986 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.574362 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt" (OuterVolumeSpecName: "kube-api-access-7lqdt") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "kube-api-access-7lqdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.574385 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.577407 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.589604 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671695 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671797 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671835 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671859 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671882 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.725534 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.255455 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" event={"ID":"9820571e-90e5-4a57-925f-6dee047d6c9d","Type":"ContainerDied","Data":"0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef"}
Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.256992 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"
Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.320293 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"]
Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.324992 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"]
Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.760596 4983 scope.go:117] "RemoveContainer" containerID="29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a"
Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.998980 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-9tclx"
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.101409 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" path="/var/lib/kubelet/pods/9820571e-90e5-4a57-925f-6dee047d6c9d/volumes"
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.194199 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") pod \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") "
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.194739 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") pod \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") "
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.196200 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca" (OuterVolumeSpecName: "serviceca") pod "9f5bd50b-b197-4deb-ac50-768e3baa6cff" (UID: "9f5bd50b-b197-4deb-ac50-768e3baa6cff"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.205533 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6" (OuterVolumeSpecName: "kube-api-access-w74t6") pod "9f5bd50b-b197-4deb-ac50-768e3baa6cff" (UID: "9f5bd50b-b197-4deb-ac50-768e3baa6cff"). InnerVolumeSpecName "kube-api-access-w74t6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.266627 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-9tclx" event={"ID":"9f5bd50b-b197-4deb-ac50-768e3baa6cff","Type":"ContainerDied","Data":"de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce"}
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.266692 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce"
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.266647 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-9tclx"
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.295997 4983 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.296040 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:42 crc kubenswrapper[4983]: I0316 00:11:42.591168 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"]
Mar 16 00:11:42 crc kubenswrapper[4983]: W0316 00:11:42.604501 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9cb240d_7329_47d5_89bd_d03b287f52c8.slice/crio-41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b WatchSource:0}: Error finding container 41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b: Status 404 returned error can't find the container with id 41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b
Mar 16 00:11:42 crc kubenswrapper[4983]: I0316 00:11:42.659926 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"]
Mar 16 00:11:42 crc kubenswrapper[4983]: W0316 00:11:42.661985 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4adba1_ea9b_4255_9ae7_c311268a26f2.slice/crio-b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef WatchSource:0}: Error finding container b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef: Status 404 returned error can't find the container with id b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.306520 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" event={"ID":"d9cb240d-7329-47d5-89bd-d03b287f52c8","Type":"ContainerStarted","Data":"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.306560 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" event={"ID":"d9cb240d-7329-47d5-89bd-d03b287f52c8","Type":"ContainerStarted","Data":"41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.308588 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerStarted","Data":"5315d03c3a0c66cd9452cd1be2631735c8666c6ac21135b6c44ab5b65cd08883"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.310077 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" event={"ID":"9b4adba1-ea9b-4255-9ae7-c311268a26f2","Type":"ContainerStarted","Data":"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.310106 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" event={"ID":"9b4adba1-ea9b-4255-9ae7-c311268a26f2","Type":"ContainerStarted","Data":"b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.312128 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerStarted","Data":"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.314945 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerStarted","Data":"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.316514 4983 generic.go:334] "Generic (PLEG): container finished" podID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerID="ff00e7152e69c4aeaaff4ebd02f8e9bc3011a8b0e33817b723307cd7fa5fe455" exitCode=0
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.316575 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerDied","Data":"ff00e7152e69c4aeaaff4ebd02f8e9bc3011a8b0e33817b723307cd7fa5fe455"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.318145 4983 generic.go:334] "Generic (PLEG): container finished" podID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerID="de0cee5fa65ae8acc06500ed4f7bfd1b7fc45fe51327cba7b49afb9439e0134f" exitCode=0
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.318252 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerDied","Data":"de0cee5fa65ae8acc06500ed4f7bfd1b7fc45fe51327cba7b49afb9439e0134f"}
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.358778 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-txzqn" podStartSLOduration=7.107563598 podStartE2EDuration="1m38.358740815s" podCreationTimestamp="2026-03-16 00:10:05 +0000 UTC" firstStartedPulling="2026-03-16 00:10:08.469938906 +0000 UTC m=+217.070037336" lastFinishedPulling="2026-03-16 00:11:39.721116103 +0000 UTC m=+308.321214553" observedRunningTime="2026-03-16 00:11:43.356244611 +0000 UTC m=+311.956343051" watchObservedRunningTime="2026-03-16 00:11:43.358740815 +0000 UTC m=+311.958839235"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.942838 4983 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.943126 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5bd50b-b197-4deb-ac50-768e3baa6cff" containerName="image-pruner"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.943141 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5bd50b-b197-4deb-ac50-768e3baa6cff" containerName="image-pruner"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.943272 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5bd50b-b197-4deb-ac50-768e3baa6cff" containerName="image-pruner"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.943685 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.943949 4983 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944273 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc" gracePeriod=15
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944307 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252" gracePeriod=15
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944326 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9" gracePeriod=15
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944339 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c" gracePeriod=15
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944343 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4" gracePeriod=15
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.945953 4983 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946224 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946301 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946331 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946343 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946358 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946371 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946385 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946397 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946436 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946447 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946464 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946475 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946495 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946504 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946521 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946529 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946653 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946733 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946862 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946875 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946885 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946895 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946908 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.947142 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.947172 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.947185 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.947193 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.947310 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.947330 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961635 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961696 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961781 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961826 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961855 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961940 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961967 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961988 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.984262 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067354 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067399 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067428 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067455 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067475 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc
kubenswrapper[4983]: I0316 00:11:44.067498 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067529 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067550 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067576 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067616 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067621 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067645 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067660 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067683 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067688 4983 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: E0316 00:11:44.234614 4983 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podfd0d91b2_07e6_4d69_ba2d_a1abde0ff1ef.slice/crio-9fde2814949dd21f55871ee57d9c0de0a132a8749d55cb58695f078937d0a417.scope\": RecentStats: unable to find data in memory cache]" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.267737 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:44 crc kubenswrapper[4983]: W0316 00:11:44.310728 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-1d34355322e05124292f9760bffc220151993c2d988dff3475114f2956c7f7b0 WatchSource:0}: Error finding container 1d34355322e05124292f9760bffc220151993c2d988dff3475114f2956c7f7b0: Status 404 returned error can't find the container with id 1d34355322e05124292f9760bffc220151993c2d988dff3475114f2956c7f7b0 Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.323671 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1d34355322e05124292f9760bffc220151993c2d988dff3475114f2956c7f7b0"} Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.325696 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.326859 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327418 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4" exitCode=0 Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327452 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9" exitCode=0 Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327465 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252" exitCode=0 Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327474 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c" exitCode=2 Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327490 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.329190 4983 generic.go:334] "Generic (PLEG): container finished" podID="7bc03354-3cba-40ac-a894-844d6ae1ee69" 
containerID="5315d03c3a0c66cd9452cd1be2631735c8666c6ac21135b6c44ab5b65cd08883" exitCode=0 Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.329214 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerDied","Data":"5315d03c3a0c66cd9452cd1be2631735c8666c6ac21135b6c44ab5b65cd08883"} Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.330243 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.330492 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.330525 4983 generic.go:334] "Generic (PLEG): container finished" podID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" containerID="9fde2814949dd21f55871ee57d9c0de0a132a8749d55cb58695f078937d0a417" exitCode=0 Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331163 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef","Type":"ContainerDied","Data":"9fde2814949dd21f55871ee57d9c0de0a132a8749d55cb58695f078937d0a417"} Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331318 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331581 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331624 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331705 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331863 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.332260 4983 status_manager.go:851] "Failed to get 
status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.332843 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.333110 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.333381 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.333922 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.334222 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.334476 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.338510 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.338801 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.339041 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.339389 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.339678 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.340043 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.340783 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.341057 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.341412 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.341688 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.343303 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.343561 4983 status_manager.go:851] "Failed to get 
status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.343861 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.344119 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4983]: E0316 00:11:44.456313 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29e7f39b8734 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,LastTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.338058 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerStarted","Data":"41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00"} Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.339378 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6"} Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.340402 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.340809 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" 
pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341092 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341320 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341584 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341699 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerStarted","Data":"4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72"} Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341873 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.342345 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.342528 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.342794 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.343119 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.343402 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.343671 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.343950 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerStarted","Data":"ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0"} Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.344077 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.344534 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.344853 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.345095 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.345376 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc 
kubenswrapper[4983]: I0316 00:11:45.345638 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.345938 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.346222 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.346403 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.346547 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.626674 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.627431 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.627897 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.628360 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.628662 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.628928 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.629266 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.629521 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.629802 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.642416 4983 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.642921 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.643132 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.643292 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.643443 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.643468 4983 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.643602 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687526 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") pod \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687609 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") pod \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687633 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") pod \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687832 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock" (OuterVolumeSpecName: "var-lock") pod "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" (UID: "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687870 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" (UID: "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.695970 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" (UID: "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.708460 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29e7f39b8734 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,LastTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.789020 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.789053 4983 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.789061 4983 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.844990 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.006132 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.006175 4983 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.232878 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.233172 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:11:46 crc kubenswrapper[4983]: E0316 00:11:46.246545 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.274374 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.274975 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.275470 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.275936 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.276208 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.276463 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.276774 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.277093 4983 status_manager.go:851] "Failed 
to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.277368 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.277659 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.354092 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.354731 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc" exitCode=0 Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.355989 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.355999 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef","Type":"ContainerDied","Data":"bb9e1da16a29be893a6a9d10f13e6b9a3bf25b7bf35da6d8f078d76e4ab8219e"} Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.356048 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb9e1da16a29be893a6a9d10f13e6b9a3bf25b7bf35da6d8f078d76e4ab8219e" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.357722 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358058 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358246 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection 
refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358442 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358628 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358831 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.359019 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.359383 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.359574 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399071 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399267 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399433 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": 
dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399600 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399797 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399978 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.400160 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.400335 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.400517 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.758972 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.759072 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.803702 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.804267 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 
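The block above is one pattern repeated verbatim: each status_manager sync issues a GET to the apiserver that dies at TCP connect. A minimal client-go sketch of that call, assuming a kubeconfig path chosen purely for illustration (this is not the kubelet's own status_manager code):

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig path, for illustration only.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Same namespace/name as the first failing entry above.
	_, err = clientset.CoreV1().Pods("openshift-marketplace").
		Get(context.TODO(), "redhat-marketplace-kjc2w", metav1.GetOptions{})
	if err != nil {
		// With the apiserver endpoint down, this surfaces the same
		// "dial tcp ... connect: connection refused" seen in every entry.
		fmt.Println("status refresh failed:", err)
	}
}
```

Run against a node in this state, the Get fails at the TCP layer before any HTTP exchange, which is why every pod in the sync loop logs the identical error.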
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.804428 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.804696 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805163 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805407 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805556 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805698 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805855 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.806193 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.851743 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.853633 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.853937 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.854204 4983 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.854440 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.854656 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.854898 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.855116 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.855313 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.855597 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" 
pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.855932 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020071 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020236 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020307 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020397 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020441 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020469 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020519 4983 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020535 4983 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020547 4983 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.047587 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.086686 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hsgsl" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" probeResult="failure" output=< Mar 16 00:11:47 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s Mar 16 00:11:47 crc kubenswrapper[4983]: > Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.366034 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.366898 4983 scope.go:117] "RemoveContainer" containerID="dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.367001 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.381008 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.382044 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.382715 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.383011 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.383348 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.383742 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.384085 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.385234 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.386140 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.386192 4983 scope.go:117] "RemoveContainer" containerID="a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.386416 4983 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.400172 4983 scope.go:117] "RemoveContainer" containerID="53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.414803 4983 scope.go:117] "RemoveContainer" containerID="64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c" Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.429993 4983 scope.go:117] "RemoveContainer" containerID="094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc" Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.433464 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[],\\\"sizeBytes\\\":1250173141},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\
\"registry.redhat.io/redhat/community-operator-index@sha256:3008c2e1161979da3569238dfcb92458c7bf2cfa54386b63c466812c99ff2497\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:3d71bca7600fcb53c6999817cf27c91d0a308793cafa2c95f1cae2bb7bee6f57\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221752025},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.re
dhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.433909 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.434120 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.434312 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.434502 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.434526 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.454591 4983 scope.go:117] "RemoveContainer" containerID="3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.105268 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.389423 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.389488 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kjc2w"
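The run above shows the kubelet's bounded retry for node status: five consecutive "Error updating node status, will retry" entries (the first being the large patch attempt carrying the image list), then "update node status exceeds retry count". A hedged Go sketch of that pattern; the constant name mirrors the retry count evident in the log, but the update function here is a stand-in, not kubelet code:

```go
package main

import (
	"errors"
	"fmt"
)

// Matches the five "will retry" errors logged before the kubelet gives up.
const nodeStatusUpdateRetry = 5

// updateNodeStatus stands in for the PATCH/GET against the apiserver; with
// the endpoint down it always fails the same way.
func updateNodeStatus() error {
	return errors.New("dial tcp 38.102.83.223:6443: connect: connection refused")
}

func main() {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := updateNodeStatus(); err != nil {
			fmt.Println("Error updating node status, will retry:", err)
			continue
		}
		return // success: stop retrying
	}
	fmt.Println("Unable to update node status: update node status exceeds retry count")
}
```

Giving up after a fixed number of tries keeps the sync loop moving; the next periodic sync simply attempts the update again, which is why the same five-error burst can recur.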
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.451733 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.452527 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.453483 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.454133 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.454611 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.455159 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.455962 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.456857 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.457448 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" 
pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:48 crc kubenswrapper[4983]: E0316 00:11:48.648862 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.400566 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qx9g" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.401099 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qx9g" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.427741 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.428440 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.428931 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.429580 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.429874 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.430104 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.430389 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: 
connection refused" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.430743 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.431086 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.431353 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:50 crc kubenswrapper[4983]: I0316 00:11:50.450511 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qx9g" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" probeResult="failure" output=< Mar 16 00:11:50 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s Mar 16 00:11:50 crc kubenswrapper[4983]: > Mar 16 00:11:51 crc kubenswrapper[4983]: E0316 00:11:51.850230 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="6.4s" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.094519 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.095178 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.095625 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.095974 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: 
connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.096353 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.096717 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.097161 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.097384 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.097686 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:55 crc kubenswrapper[4983]: E0316 00:11:55.709705 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29e7f39b8734 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,LastTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.042867 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:11:56 crc kubenswrapper[4983]: 
I0316 00:11:56.043799 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.044078 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.044317 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.044670 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.045263 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.045505 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.045826 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.046131 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.046350 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.079430 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.080122 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.080534 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.080945 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.081213 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.081514 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.082081 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.082479 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.082873 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": 
dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.083047 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.092141 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.092856 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093130 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093304 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093455 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093614 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093782 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093940 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc 
kubenswrapper[4983]: I0316 00:11:56.094093 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.094246 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.117100 4983 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.117138 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:56 crc kubenswrapper[4983]: E0316 00:11:56.117494 4983 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.117978 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.281779 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.282921 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283068 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283209 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283341 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection 
refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283486 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283623 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283769 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283926 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.284058 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.419201 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ac719b6d5974cdb29679bd18f99beb6f7ded826cf7f9bf7bcf8117d7fb98dc1b"} Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.796054 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.796566 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.796977 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.797235 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" 
pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.797542 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.797785 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.798019 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.798264 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.798514 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.798767 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.427258 4983 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3862dea5b181d58336a593f482a4ce66f4ee8e743ace00ca62e0c6bde1865a68" exitCode=0 Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.427301 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3862dea5b181d58336a593f482a4ce66f4ee8e743ace00ca62e0c6bde1865a68"} Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.427591 4983 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.427822 
4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.428291 4983 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.428402 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.428999 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.429298 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.429628 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.429970 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.430394 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.430655 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.431067 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" 
pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.431999 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.477200 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:57Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:57Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:57Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:57Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[],\\\"sizeBytes\\\":1250173141},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3008c2e1161979da3569238dfcb92458c7bf2cfa54386b63c466812c99ff2497\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:3d71bca7600fcb53c6999817cf27c91d0a308793cafa2c95f1cae2bb7bee6f57\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221752025},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f
73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.477856 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.478276 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.478633 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.479195 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.479237 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.122485 4983 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.122553 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.438935 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.441171 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.441251 4983 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a" exitCode=1 
Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.441340 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a"} Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.442156 4983 scope.go:117] "RemoveContainer" containerID="840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.444651 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ead20f7bc8379f8c39f8f7828f1345adb9aaee42fa3767a1c38ff4c1c773f1fb"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.437382 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qx9g" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.453374 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.453974 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.454125 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3821a4b13ed2ef6f61a0d42fd64d93cd5287b1aa8d8d74d76819753dc4a0c27e"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458394 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"22c13c0d9a35f0019afb4fe3c13d6f547188e5cb08665c79a453887c9d732baf"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458441 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5a31a9d6ffebdf1c7824d2abc6cfa47c386acdef3bc78af32325e775357ac172"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458455 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d98c834b2fc70b5f7985a629e8e583a7cd31e26523ec30d6130d9808bb0ea915"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458468 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a73adcf069e28dc6fb6a72e0659819c919b86830014a3543d4d382310d6b3334"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458800 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458910 4983 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 
00:11:59.458983 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.475459 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qx9g" Mar 16 00:12:01 crc kubenswrapper[4983]: I0316 00:12:01.119569 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:01 crc kubenswrapper[4983]: I0316 00:12:01.119869 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:01 crc kubenswrapper[4983]: I0316 00:12:01.126077 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:03 crc kubenswrapper[4983]: I0316 00:12:03.707947 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:03 crc kubenswrapper[4983]: I0316 00:12:03.713648 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:04 crc kubenswrapper[4983]: I0316 00:12:04.485521 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:04 crc kubenswrapper[4983]: I0316 00:12:04.757658 4983 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:04 crc kubenswrapper[4983]: I0316 00:12:04.989387 4983 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="85848ab8-d516-4d81-82d6-469eb041be6d" Mar 16 00:12:05 crc kubenswrapper[4983]: I0316 00:12:05.498476 4983 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:12:05 crc kubenswrapper[4983]: I0316 00:12:05.498830 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:12:05 crc kubenswrapper[4983]: I0316 00:12:05.501247 4983 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="85848ab8-d516-4d81-82d6-469eb041be6d" Mar 16 00:12:14 crc kubenswrapper[4983]: I0316 00:12:14.533744 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 16 00:12:15 crc kubenswrapper[4983]: I0316 00:12:15.700791 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 16 00:12:15 crc kubenswrapper[4983]: I0316 00:12:15.915655 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.024997 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.073939 4983 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.141805 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.144915 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.251786 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.318108 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.457217 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.457896 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.567442 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.570886 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.673015 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.753024 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.906907 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.070778 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.189545 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.198304 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.341370 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.346534 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.502858 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.548162 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 16 
00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.638307 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.650812 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.750926 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.755549 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.987901 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.013206 4983 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.059636 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.080076 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.080775 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.193765 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.260813 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.320149 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.420002 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.518765 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.567135 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.612607 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.642245 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.646672 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.648103 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.668963 4983 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.720651 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.908418 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.941068 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.957860 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.110558 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.116675 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.159509 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.191302 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.201032 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.267243 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.302653 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.324104 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.385136 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.557376 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.612543 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.665425 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.780498 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.800291 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.826814 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.988911 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.993991 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.004685 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.043981 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.083345 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.122365 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.146825 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.162919 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.187515 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.198860 4983 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.225846 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.235341 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.247329 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.250848 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.251217 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.294232 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.388055 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.537887 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: 
I0316 00:12:20.548632 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.571915 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.651727 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.683098 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.759527 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.771411 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.785223 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.847499 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.856161 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.895961 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.932902 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.997650 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.057256 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.066211 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.163482 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.168551 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.256898 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.312628 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.316453 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 16 00:12:21 crc 
kubenswrapper[4983]: I0316 00:12:21.336784 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.338691 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.341307 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.554561 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.574173 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.609878 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.637591 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.749853 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.757745 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.816040 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.817082 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.920027 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.013820 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.020370 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.107029 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.115025 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.153708 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.284190 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.298856 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.502074 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.545622 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.624605 4983 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.650421 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.698409 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.709962 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.714212 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.745505 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.745708 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.746378 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.776175 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.875846 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.961326 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.965586 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.131067 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.231179 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.393490 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.409951 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.411139 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.455506 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.463745 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.687400 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.707091 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.738820 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.751671 4983 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.753399 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qx9g" podStartSLOduration=45.522868203 podStartE2EDuration="2m14.753382621s" podCreationTimestamp="2026-03-16 00:10:09 +0000 UTC" firstStartedPulling="2026-03-16 00:10:15.872464838 +0000 UTC m=+224.472563268" lastFinishedPulling="2026-03-16 00:11:45.102979236 +0000 UTC m=+313.703077686" observedRunningTime="2026-03-16 00:12:04.85359295 +0000 UTC m=+333.453691380" watchObservedRunningTime="2026-03-16 00:12:23.753382621 +0000 UTC m=+352.353481071"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.753690 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" podStartSLOduration=57.753684959 podStartE2EDuration="57.753684959s" podCreationTimestamp="2026-03-16 00:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:04.802433092 +0000 UTC m=+333.402531522" watchObservedRunningTime="2026-03-16 00:12:23.753684959 +0000 UTC m=+352.353783389"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.754555 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.754548422 podStartE2EDuration="40.754548422s" podCreationTimestamp="2026-03-16 00:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:04.869174349 +0000 UTC m=+333.469272779" watchObservedRunningTime="2026-03-16 00:12:23.754548422 +0000 UTC m=+352.354646852"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.754668 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" podStartSLOduration=58.754663285 podStartE2EDuration="58.754663285s" podCreationTimestamp="2026-03-16 00:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:04.915313621 +0000 UTC m=+333.515412051" watchObservedRunningTime="2026-03-16 00:12:23.754663285 +0000 UTC m=+352.354761715"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.755124 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kjc2w" podStartSLOduration=44.06910464 podStartE2EDuration="2m15.755118247s" podCreationTimestamp="2026-03-16 00:10:08 +0000 UTC" firstStartedPulling="2026-03-16 00:10:10.53564224 +0000 UTC m=+219.135740670" lastFinishedPulling="2026-03-16 00:11:42.221655847 +0000 UTC m=+310.821754277" observedRunningTime="2026-03-16 00:12:04.833630832 +0000 UTC m=+333.433729262" watchObservedRunningTime="2026-03-16 00:12:23.755118247 +0000 UTC m=+352.355216677"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.755645 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sv5g7" podStartSLOduration=41.526657801 podStartE2EDuration="2m17.755638401s" podCreationTimestamp="2026-03-16 00:10:06 +0000 UTC" firstStartedPulling="2026-03-16 00:10:08.437563419 +0000 UTC m=+217.037661849" lastFinishedPulling="2026-03-16 00:11:44.666544019 +0000 UTC m=+313.266642449" observedRunningTime="2026-03-16 00:12:04.783068281 +0000 UTC m=+333.383166751" watchObservedRunningTime="2026-03-16 00:12:23.755638401 +0000 UTC m=+352.355736831"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.756207 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hsgsl" podStartSLOduration=42.350967706 podStartE2EDuration="2m18.756198127s" podCreationTimestamp="2026-03-16 00:10:05 +0000 UTC" firstStartedPulling="2026-03-16 00:10:08.43759261 +0000 UTC m=+217.037691040" lastFinishedPulling="2026-03-16 00:11:44.842823031 +0000 UTC m=+313.442921461" observedRunningTime="2026-03-16 00:12:04.887748819 +0000 UTC m=+333.487847249" watchObservedRunningTime="2026-03-16 00:12:23.756198127 +0000 UTC m=+352.356296557"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.756598 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.756641 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.760715 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.773178 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.775037 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.779726 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.77971496 podStartE2EDuration="19.77971496s" podCreationTimestamp="2026-03-16 00:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:23.779014311 +0000 UTC m=+352.379112751" watchObservedRunningTime="2026-03-16 00:12:23.77971496 +0000 UTC m=+352.379813390"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.816124 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.855573 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.867328 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.941374 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.033510 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.127698 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.144468 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.222974 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.264998 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.308045 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.378238 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.444272 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.491197 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.612800 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.618010 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.633461 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.684659 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.698344 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"]
Mar 16 00:12:24 crc kubenswrapper[4983]: E0316 00:12:24.698629 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" containerName="installer"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.698653 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" containerName="installer"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.698802 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" containerName="installer"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.699242 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-pflh5"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.701713 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.701713 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.702245 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.714614 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"]
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.749043 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.765240 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.779918 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.821105 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.835888 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") pod \"auto-csr-approver-29560332-pflh5\" (UID: \"90d9cc10-08aa-485e-a7cd-305a3e316c39\") " pod="openshift-infra/auto-csr-approver-29560332-pflh5"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.883765 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.890257 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.904349 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.937873 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") pod \"auto-csr-approver-29560332-pflh5\" (UID: \"90d9cc10-08aa-485e-a7cd-305a3e316c39\") " pod="openshift-infra/auto-csr-approver-29560332-pflh5"
Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.961611 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") pod \"auto-csr-approver-29560332-pflh5\" (UID: \"90d9cc10-08aa-485e-a7cd-305a3e316c39\") " pod="openshift-infra/auto-csr-approver-29560332-pflh5"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.061649 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-pflh5"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.123963 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.171576 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.202849 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.308773 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.341112 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.424109 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.480600 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.480854 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"]
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.530084 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.554647 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.565212 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.612368 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-pflh5" event={"ID":"90d9cc10-08aa-485e-a7cd-305a3e316c39","Type":"ContainerStarted","Data":"dc6b9b2afc9fc030b727abc83bb90ecd98dd4e195ab2eb513f9fb76ca8eb3fc0"}
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.627490 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.661448 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.719318 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.764035 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.810743 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.911538 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"]
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.911743 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerName="controller-manager" containerID="cri-o://a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef" gracePeriod=30
Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.933035 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.008604 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"]
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.009064 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerName="route-controller-manager" containerID="cri-o://3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9" gracePeriod=30
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.154385 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.209135 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.324247 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.373120 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.398340 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.406193 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.450185 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453482 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453536 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453600 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453644 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453715 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.455050 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.455104 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config" (OuterVolumeSpecName: "config") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.456172 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.460336 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp" (OuterVolumeSpecName: "kube-api-access-sk2gp") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "kube-api-access-sk2gp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.461553 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.478919 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.493689 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.531881 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554367 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") pod \"d9cb240d-7329-47d5-89bd-d03b287f52c8\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554404 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") pod \"d9cb240d-7329-47d5-89bd-d03b287f52c8\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554456 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") pod \"d9cb240d-7329-47d5-89bd-d03b287f52c8\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554503 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") pod \"d9cb240d-7329-47d5-89bd-d03b287f52c8\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") "
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554729 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554742 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554767 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554776 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554784 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.555452 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "d9cb240d-7329-47d5-89bd-d03b287f52c8" (UID: "d9cb240d-7329-47d5-89bd-d03b287f52c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.556097 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config" (OuterVolumeSpecName: "config") pod "d9cb240d-7329-47d5-89bd-d03b287f52c8" (UID: "d9cb240d-7329-47d5-89bd-d03b287f52c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.557605 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z" (OuterVolumeSpecName: "kube-api-access-bm24z") pod "d9cb240d-7329-47d5-89bd-d03b287f52c8" (UID: "d9cb240d-7329-47d5-89bd-d03b287f52c8"). InnerVolumeSpecName "kube-api-access-bm24z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.557909 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d9cb240d-7329-47d5-89bd-d03b287f52c8" (UID: "d9cb240d-7329-47d5-89bd-d03b287f52c8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619103 4983 generic.go:334] "Generic (PLEG): container finished" podID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerID="3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9" exitCode=0
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619182 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619211 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" event={"ID":"d9cb240d-7329-47d5-89bd-d03b287f52c8","Type":"ContainerDied","Data":"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"}
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619263 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" event={"ID":"d9cb240d-7329-47d5-89bd-d03b287f52c8","Type":"ContainerDied","Data":"41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b"}
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619280 4983 scope.go:117] "RemoveContainer" containerID="3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.621888 4983 generic.go:334] "Generic (PLEG): container finished" podID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerID="a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef" exitCode=0
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.621912 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" event={"ID":"9b4adba1-ea9b-4255-9ae7-c311268a26f2","Type":"ContainerDied","Data":"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"}
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.621948 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" event={"ID":"9b4adba1-ea9b-4255-9ae7-c311268a26f2","Type":"ContainerDied","Data":"b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef"}
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.622019 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.636979 4983 scope.go:117] "RemoveContainer" containerID="3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"
Mar 16 00:12:26 crc kubenswrapper[4983]: E0316 00:12:26.637313 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9\": container with ID starting with 3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9 not found: ID does not exist" containerID="3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.637342 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"} err="failed to get container status \"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9\": rpc error: code = NotFound desc = could not find container \"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9\": container with ID starting with 3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9 not found: ID does not exist"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.637360 4983 scope.go:117] "RemoveContainer" containerID="a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.647166 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.652030 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"]
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.656048 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.656083 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.656099 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.656112 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.658623 4983 scope.go:117] "RemoveContainer" containerID="a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"
Mar 16 00:12:26 crc kubenswrapper[4983]: E0316 00:12:26.659018 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef\": container with ID starting with a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef not found: ID does not exist" containerID="a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.659109 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"} err="failed to get container status \"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef\": rpc error: code = NotFound desc = could not find container \"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef\": container with ID starting with a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef not found: ID does not exist"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.664460 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"]
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.670717 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"]
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.675933 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"]
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.713717 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.758464 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.763891 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.766917 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.805248 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.817535 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.936628 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.943731 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.967071 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.978246 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.179808 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.281712 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.329779 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.399408 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.450946 4983 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.451206 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6" gracePeriod=5
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.486707 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.631949 4983 generic.go:334] "Generic (PLEG): container finished" podID="90d9cc10-08aa-485e-a7cd-305a3e316c39" containerID="0e3f6e1e6221d6bd922f567a1feb21e97e8062170d3d8a1f33f38076de2dd3b8" exitCode=0
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.632133 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-pflh5" event={"ID":"90d9cc10-08aa-485e-a7cd-305a3e316c39","Type":"ContainerDied","Data":"0e3f6e1e6221d6bd922f567a1feb21e97e8062170d3d8a1f33f38076de2dd3b8"}
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.644861 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689430 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"]
Mar 16 00:12:27 crc kubenswrapper[4983]: E0316 00:12:27.689735 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerName="controller-manager"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689773 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerName="controller-manager"
Mar 16 00:12:27 crc kubenswrapper[4983]: E0316 00:12:27.689787 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerName="route-controller-manager"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689797 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerName="route-controller-manager"
Mar 16 00:12:27 crc kubenswrapper[4983]: E0316 00:12:27.689811 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689819 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689977 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerName="route-controller-manager"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689995 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.690010 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerName="controller-manager"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.690475 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.692353 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.693053 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.693560 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.693742 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.694625 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.694952 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.696604 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"]
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.697309 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.703671 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.703844 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.704048 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.704372 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.704727 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.704906 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.706515 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.710098 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"]
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.712846 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"]
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.713983 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769205 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769247 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769267 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769300 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcpjg\" (UniqueName: \"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769364 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769395 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769412 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769435 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769484 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.778493 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.799748 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870061 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870264 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870440 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870514 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcpjg\" (UniqueName: \"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870598 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870689 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870786 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870887 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870982 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.872313 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.872614 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.872825 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.872909 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.873157 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.875242 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.877484 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.908277 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.910837 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcpjg\" (UniqueName: \"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.945383 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.954046 4983 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.998284 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.014790 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.024530 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.100521 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" path="/var/lib/kubelet/pods/9b4adba1-ea9b-4255-9ae7-c311268a26f2/volumes"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.101297 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" path="/var/lib/kubelet/pods/d9cb240d-7329-47d5-89bd-d03b287f52c8/volumes"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.146151 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.170568 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.275793 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.316415 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.345803 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.398824 4983 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.402404 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.410557 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.435000 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.465484 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.490413 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"]
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.494651 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"]
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.639986 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" event={"ID":"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa","Type":"ContainerStarted","Data":"b9888d3c16b45058c14a5b2ded8f3ebdb76d157bd33693c09d4465398bc356f5"}
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.641661 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" event={"ID":"ba019949-b4c5-4df0-b625-32daf56cabec","Type":"ContainerStarted","Data":"a1d7dea939295229f9e48941df849399a44450bd71bed65af8d7bf28aa012cb3"}
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.856642 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-pflh5"
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.982620 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") pod \"90d9cc10-08aa-485e-a7cd-305a3e316c39\" (UID: \"90d9cc10-08aa-485e-a7cd-305a3e316c39\") "
Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.986973 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7" (OuterVolumeSpecName: "kube-api-access-k76k7") pod "90d9cc10-08aa-485e-a7cd-305a3e316c39" (UID: "90d9cc10-08aa-485e-a7cd-305a3e316c39"). InnerVolumeSpecName "kube-api-access-k76k7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.046038 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.083645 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.196511 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.241833 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.296707 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.447065 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.483393 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.650749 4983 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-pflh5" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.650684 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-pflh5" event={"ID":"90d9cc10-08aa-485e-a7cd-305a3e316c39","Type":"ContainerDied","Data":"dc6b9b2afc9fc030b727abc83bb90ecd98dd4e195ab2eb513f9fb76ca8eb3fc0"} Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.651380 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6b9b2afc9fc030b727abc83bb90ecd98dd4e195ab2eb513f9fb76ca8eb3fc0" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.652225 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" event={"ID":"ba019949-b4c5-4df0-b625-32daf56cabec","Type":"ContainerStarted","Data":"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"} Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.655246 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.657904 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" event={"ID":"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa","Type":"ContainerStarted","Data":"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"} Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.658904 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.660926 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.662939 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.670826 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" podStartSLOduration=4.670803316 podStartE2EDuration="4.670803316s" podCreationTimestamp="2026-03-16 00:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:29.668978877 +0000 UTC m=+358.269077307" watchObservedRunningTime="2026-03-16 00:12:29.670803316 +0000 UTC m=+358.270901776" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.714082 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" podStartSLOduration=3.714063391 podStartE2EDuration="3.714063391s" podCreationTimestamp="2026-03-16 00:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:29.712348695 +0000 UTC m=+358.312447125" watchObservedRunningTime="2026-03-16 00:12:29.714063391 +0000 UTC m=+358.314161831" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.796632 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" 
Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.946822 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.050468 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.151957 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.291165 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.555337 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.670144 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.044519 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.077600 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.083013 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.194263 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.345134 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.513921 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.602071 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.703546 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.334564 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.421542 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.616153 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.616232 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.657956 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658000 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658014 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658048 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658072 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658120 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658159 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658193 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658234 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658586 4983 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658613 4983 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658632 4983 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658648 4983 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.664981 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.675876 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.675925 4983 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6" exitCode=137
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.675969 4983 scope.go:117] "RemoveContainer" containerID="bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.676003 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.712304 4983 scope.go:117] "RemoveContainer" containerID="bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6"
Mar 16 00:12:32 crc kubenswrapper[4983]: E0316 00:12:32.713119 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6\": container with ID starting with bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6 not found: ID does not exist" containerID="bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.713172 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6"} err="failed to get container status \"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6\": rpc error: code = NotFound desc = could not find container \"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6\": container with ID starting with bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6 not found: ID does not exist"
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.759688 4983 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.844191 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.105595 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.106153 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.122078 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.122111 4983 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5353c607-9ed4-4276-9367-bcd7087a8af4"
Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.130035 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.130107 4983 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5353c607-9ed4-4276-9367-bcd7087a8af4"
Mar 16 00:12:45 crc kubenswrapper[4983]: I0316 00:12:45.905279 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"]
Mar 16 00:12:45 crc kubenswrapper[4983]: I0316 00:12:45.906027 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" containerName="controller-manager" containerID="cri-o://d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877" gracePeriod=30
Mar 16 00:12:45 crc kubenswrapper[4983]: I0316 00:12:45.920787 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"]
Mar 16 00:12:45 crc kubenswrapper[4983]: I0316 00:12:45.920987 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerName="route-controller-manager" containerID="cri-o://49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7" gracePeriod=30
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.463286 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.513477 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543368 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") pod \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543453 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") pod \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543478 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543527 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcpjg\" (UniqueName: \"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543558 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543577 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543591 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543620 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") pod \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543646 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") pod \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") "
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.544951 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca" (OuterVolumeSpecName: "client-ca") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.544952 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca" (OuterVolumeSpecName: "client-ca") pod "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" (UID: "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.545048 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.545576 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config" (OuterVolumeSpecName: "config") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.545624 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config" (OuterVolumeSpecName: "config") pod "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" (UID: "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.549065 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" (UID: "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.549111 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.549204 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg" (OuterVolumeSpecName: "kube-api-access-wcpjg") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "kube-api-access-wcpjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.549442 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t" (OuterVolumeSpecName: "kube-api-access-w248t") pod "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" (UID: "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa"). InnerVolumeSpecName "kube-api-access-w248t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645258 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645293 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645309 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcpjg\" (UniqueName: \"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645320 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645332 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645342 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645354 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645364 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645376 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759073 4983 generic.go:334] "Generic (PLEG): container finished" podID="ba019949-b4c5-4df0-b625-32daf56cabec" containerID="d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877" exitCode=0
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759122 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" event={"ID":"ba019949-b4c5-4df0-b625-32daf56cabec","Type":"ContainerDied","Data":"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"}
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759136 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759161 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" event={"ID":"ba019949-b4c5-4df0-b625-32daf56cabec","Type":"ContainerDied","Data":"a1d7dea939295229f9e48941df849399a44450bd71bed65af8d7bf28aa012cb3"}
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759181 4983 scope.go:117] "RemoveContainer" containerID="d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.765434 4983 generic.go:334] "Generic (PLEG): container finished" podID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerID="49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7" exitCode=0
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.765725 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" event={"ID":"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa","Type":"ContainerDied","Data":"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"}
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.765900 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.765952 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" event={"ID":"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa","Type":"ContainerDied","Data":"b9888d3c16b45058c14a5b2ded8f3ebdb76d157bd33693c09d4465398bc356f5"}
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.781508 4983 scope.go:117] "RemoveContainer" containerID="d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"
Mar 16 00:12:46 crc kubenswrapper[4983]: E0316 00:12:46.785682 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877\": container with ID starting with d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877 not found: ID does not exist" containerID="d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.785889 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"} err="failed to get container status \"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877\": rpc error: code = NotFound desc = could not find container \"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877\": container with ID starting with d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877 not found: ID does not exist"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.786164 4983 scope.go:117] "RemoveContainer" containerID="49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.789157 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"]
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.795212 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"]
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.809279 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"]
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.811385 4983 scope.go:117] "RemoveContainer" containerID="49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"
Mar 16 00:12:46 crc kubenswrapper[4983]: E0316 00:12:46.811849 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7\": container with ID starting with 49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7 not found: ID does not exist" containerID="49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.811897 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"} err="failed to get container status \"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7\": rpc error: code = NotFound desc = could not find container \"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7\": container with ID starting with 49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7 not found: ID does not exist"
Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.813647 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"]
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703246 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"]
Mar 16 00:12:47 crc kubenswrapper[4983]: E0316 00:12:47.703548 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" containerName="controller-manager"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703564 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" containerName="controller-manager"
Mar 16 00:12:47 crc kubenswrapper[4983]: E0316 00:12:47.703576 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d9cc10-08aa-485e-a7cd-305a3e316c39" containerName="oc"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703586 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d9cc10-08aa-485e-a7cd-305a3e316c39" containerName="oc"
Mar 16 00:12:47 crc kubenswrapper[4983]: E0316 00:12:47.703600 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerName="route-controller-manager"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703609 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerName="route-controller-manager"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703732 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" containerName="controller-manager"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703748 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerName="route-controller-manager"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703782 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d9cc10-08aa-485e-a7cd-305a3e316c39" containerName="oc"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.704241 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.706116 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.706483 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.706906 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.706901 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.707472 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"]
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.708123 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.708608 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.708692 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.710181 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.710407 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.710789 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.710787 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.711519 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.715040 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.715277 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.716314 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"]
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.726517 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"]
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758189 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758243 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758270 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758295 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758345 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758376 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758411 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758435 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758515 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.859595 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.859951 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.860066 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.860184 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861410 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861518 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861671 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861774 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.862180 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861364 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.860434 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.862811 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.863109 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.864465 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.864916 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.864960 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.877594 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.883075 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.025308 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.036071 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.147643 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" path="/var/lib/kubelet/pods/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa/volumes"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.149479 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" path="/var/lib/kubelet/pods/ba019949-b4c5-4df0-b625-32daf56cabec/volumes"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.474862 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"]
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.480134 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"]
Mar 16 00:12:48 crc kubenswrapper[4983]: W0316 00:12:48.482246 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7705ce2_6b0a_4204_857b_b80448d4b201.slice/crio-3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563 WatchSource:0}: Error finding container 3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563: Status 404 returned error can't find the container with id 3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563
Mar 16 00:12:48 crc kubenswrapper[4983]: W0316 00:12:48.484124 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c999f7_aab6_48d2_afe8_2c317c1825f5.slice/crio-876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a WatchSource:0}: Error finding container 876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a: Status 404 returned error can't find the container with id 876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.779728 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" event={"ID":"17c999f7-aab6-48d2-afe8-2c317c1825f5","Type":"ContainerStarted","Data":"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885"}
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.779782 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" event={"ID":"17c999f7-aab6-48d2-afe8-2c317c1825f5","Type":"ContainerStarted","Data":"876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a"}
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.779893 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.781232 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" event={"ID":"c7705ce2-6b0a-4204-857b-b80448d4b201","Type":"ContainerStarted","Data":"81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae"}
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.781261 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" event={"ID":"c7705ce2-6b0a-4204-857b-b80448d4b201","Type":"ContainerStarted","Data":"3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563"}
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.781452 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.792851 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.802485 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" podStartSLOduration=3.802465341 podStartE2EDuration="3.802465341s" podCreationTimestamp="2026-03-16 00:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:48.799980804 +0000 UTC m=+377.400079234" watchObservedRunningTime="2026-03-16 00:12:48.802465341 +0000 UTC m=+377.402563771"
Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.845908 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" podStartSLOduration=3.8458924100000003 podStartE2EDuration="3.84589241s" podCreationTimestamp="2026-03-16 00:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:48.841872672 +0000 UTC m=+377.441971102" watchObservedRunningTime="2026-03-16 00:12:48.84589241 +0000 UTC m=+377.445990840"
Mar 16 00:12:49 crc kubenswrapper[4983]: I0316 00:12:49.212034 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"
Mar 16 00:12:51 crc kubenswrapper[4983]: I0316 00:12:51.540486 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"]
Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.376122 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"]
Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.377793 4983 kuberuntime_container.go:808] "Killing container
with a grace period" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerName="controller-manager" containerID="cri-o://81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae" gracePeriod=30 Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.874733 4983 generic.go:334] "Generic (PLEG): container finished" podID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerID="81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae" exitCode=0 Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.875047 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" event={"ID":"c7705ce2-6b0a-4204-857b-b80448d4b201","Type":"ContainerDied","Data":"81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae"} Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.875077 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" event={"ID":"c7705ce2-6b0a-4204-857b-b80448d4b201","Type":"ContainerDied","Data":"3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563"} Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.875091 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563" Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.912776 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.095937 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.096047 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.096231 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.096391 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.096459 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.097219 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.097324 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config" (OuterVolumeSpecName: "config") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.097578 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.103142 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp" (OuterVolumeSpecName: "kube-api-access-m2jlp") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "kube-api-access-m2jlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.103911 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197522 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197573 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197587 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197625 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197636 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.716245 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8"] Mar 16 00:13:07 crc kubenswrapper[4983]: E0316 00:13:07.716689 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerName="controller-manager" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.716708 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerName="controller-manager" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.716887 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerName="controller-manager" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.717514 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.726417 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8"] Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.879944 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907093 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-client-ca\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907141 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27455c79-fbad-4784-9ca8-8280c3561064-serving-cert\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907171 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-config\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907215 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907231 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjld5\" (UniqueName: \"kubernetes.io/projected/27455c79-fbad-4784-9ca8-8280c3561064-kube-api-access-rjld5\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907396 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"] Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.912830 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"] Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008175 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-client-ca\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008419 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27455c79-fbad-4784-9ca8-8280c3561064-serving-cert\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 
00:13:08.008450 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-config\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008502 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjld5\" (UniqueName: \"kubernetes.io/projected/27455c79-fbad-4784-9ca8-8280c3561064-kube-api-access-rjld5\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008524 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.009485 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-client-ca\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.009780 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.010486 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-config\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.013333 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27455c79-fbad-4784-9ca8-8280c3561064-serving-cert\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.037376 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjld5\" (UniqueName: \"kubernetes.io/projected/27455c79-fbad-4784-9ca8-8280c3561064-kube-api-access-rjld5\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.097949 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" path="/var/lib/kubelet/pods/c7705ce2-6b0a-4204-857b-b80448d4b201/volumes" Mar 16 00:13:08 crc 
kubenswrapper[4983]: I0316 00:13:08.333122 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.566156 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8"] Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.886716 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" event={"ID":"27455c79-fbad-4784-9ca8-8280c3561064","Type":"ContainerStarted","Data":"32f011957e1f8eccbea09f7766b3a6d671f89934a7e8fbccd2167a586bec10fb"} Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.887116 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" event={"ID":"27455c79-fbad-4784-9ca8-8280c3561064","Type":"ContainerStarted","Data":"070a7fb6ed9bd86aeb28bd34c2403bfa51a62ba1157341aad1702a3079f4691c"} Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.887572 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.893057 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.903141 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" podStartSLOduration=2.903124467 podStartE2EDuration="2.903124467s" podCreationTimestamp="2026-03-16 00:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:08.902471569 +0000 UTC m=+397.502570009" watchObservedRunningTime="2026-03-16 00:13:08.903124467 +0000 UTC m=+397.503222897" Mar 16 00:13:09 crc kubenswrapper[4983]: I0316 00:13:09.744077 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"] Mar 16 00:13:09 crc kubenswrapper[4983]: I0316 00:13:09.744303 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sv5g7" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="registry-server" containerID="cri-o://4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72" gracePeriod=2 Mar 16 00:13:09 crc kubenswrapper[4983]: I0316 00:13:09.912155 4983 generic.go:334] "Generic (PLEG): container finished" podID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerID="4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72" exitCode=0 Mar 16 00:13:09 crc kubenswrapper[4983]: I0316 00:13:09.912215 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerDied","Data":"4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72"} Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.196359 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.334308 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") pod \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.334474 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") pod \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.334528 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") pod \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.335259 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities" (OuterVolumeSpecName: "utilities") pod "b6bd9bf5-fa59-4fef-9589-7b5865098bd2" (UID: "b6bd9bf5-fa59-4fef-9589-7b5865098bd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.335956 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txzqn"] Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.336171 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-txzqn" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="registry-server" containerID="cri-o://670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee" gracePeriod=2 Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.345302 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf" (OuterVolumeSpecName: "kube-api-access-x4gpf") pod "b6bd9bf5-fa59-4fef-9589-7b5865098bd2" (UID: "b6bd9bf5-fa59-4fef-9589-7b5865098bd2"). InnerVolumeSpecName "kube-api-access-x4gpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.385220 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6bd9bf5-fa59-4fef-9589-7b5865098bd2" (UID: "b6bd9bf5-fa59-4fef-9589-7b5865098bd2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.436075 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.436110 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.436123 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.749977 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919614 4983 generic.go:334] "Generic (PLEG): container finished" podID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerID="670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee" exitCode=0 Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919682 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919714 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerDied","Data":"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee"} Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919776 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerDied","Data":"6ff5c36eac345013e6cc957efaa73b943a59621eebe35f0b43b2431024b1cecb"} Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919798 4983 scope.go:117] "RemoveContainer" containerID="670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.922267 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerDied","Data":"aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e"} Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.922298 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.941303 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") pod \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.941409 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") pod \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.941536 4983 scope.go:117] "RemoveContainer" containerID="0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.941585 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") pod \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.942559 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities" (OuterVolumeSpecName: "utilities") pod "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" (UID: "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.944438 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2" (OuterVolumeSpecName: "kube-api-access-bqln2") pod "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" (UID: "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4"). InnerVolumeSpecName "kube-api-access-bqln2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.962079 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"] Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.964140 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"] Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.979382 4983 scope.go:117] "RemoveContainer" containerID="10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.997650 4983 scope.go:117] "RemoveContainer" containerID="670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee" Mar 16 00:13:10 crc kubenswrapper[4983]: E0316 00:13:10.998478 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee\": container with ID starting with 670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee not found: ID does not exist" containerID="670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.998549 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee"} err="failed to get container status \"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee\": rpc error: code = NotFound desc = could not find container \"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee\": container with ID starting with 670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee not found: ID does not exist" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.998581 4983 scope.go:117] "RemoveContainer" containerID="0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf" Mar 16 00:13:10 crc kubenswrapper[4983]: E0316 00:13:10.999308 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf\": container with ID starting with 0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf not found: ID does not exist" containerID="0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.999341 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf"} err="failed to get container status \"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf\": rpc error: code = NotFound desc = could not find container \"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf\": container with ID starting with 0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf not found: ID does not exist" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.999366 4983 scope.go:117] "RemoveContainer" containerID="10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c" Mar 16 00:13:10 crc kubenswrapper[4983]: E0316 00:13:10.999811 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c\": container with ID starting with 
10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c not found: ID does not exist" containerID="10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.999860 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"} err="failed to get container status \"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c\": rpc error: code = NotFound desc = could not find container \"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c\": container with ID starting with 10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c not found: ID does not exist" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.999892 4983 scope.go:117] "RemoveContainer" containerID="4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72" Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.001373 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" (UID: "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.012555 4983 scope.go:117] "RemoveContainer" containerID="ff00e7152e69c4aeaaff4ebd02f8e9bc3011a8b0e33817b723307cd7fa5fe455" Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.031022 4983 scope.go:117] "RemoveContainer" containerID="7455d52b296ac2dc05d5dba007a96face87721af18e58d348eedd55fbc4a2082" Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.043406 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.043433 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.043443 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.255694 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txzqn"] Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.259045 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-txzqn"] Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.107389 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" path="/var/lib/kubelet/pods/b6bd9bf5-fa59-4fef-9589-7b5865098bd2/volumes" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.108780 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" path="/var/lib/kubelet/pods/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4/volumes" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.135881 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-kjc2w"] Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.136410 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kjc2w" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="registry-server" containerID="cri-o://840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb" gracePeriod=2 Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.576909 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.734251 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"] Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.734467 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qx9g" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" containerID="cri-o://41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00" gracePeriod=2 Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.762872 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") pod \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.763326 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") pod \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.763481 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") pod \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.763629 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities" (OuterVolumeSpecName: "utilities") pod "00a4a2a2-9263-4b76-8294-fa9c4d918fc7" (UID: "00a4a2a2-9263-4b76-8294-fa9c4d918fc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.764087 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.776993 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w" (OuterVolumeSpecName: "kube-api-access-r8x8w") pod "00a4a2a2-9263-4b76-8294-fa9c4d918fc7" (UID: "00a4a2a2-9263-4b76-8294-fa9c4d918fc7"). InnerVolumeSpecName "kube-api-access-r8x8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.788644 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00a4a2a2-9263-4b76-8294-fa9c4d918fc7" (UID: "00a4a2a2-9263-4b76-8294-fa9c4d918fc7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.866440 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.866490 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.944918 4983 generic.go:334] "Generic (PLEG): container finished" podID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerID="840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb" exitCode=0 Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.944999 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerDied","Data":"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"} Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.945031 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerDied","Data":"edfb4c106db9ff156e89258c7be736e143b651348ae2eece9c28a73c16f1a791"} Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.945051 4983 scope.go:117] "RemoveContainer" containerID="840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.945162 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.949345 4983 generic.go:334] "Generic (PLEG): container finished" podID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerID="41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00" exitCode=0 Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.949391 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerDied","Data":"41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00"} Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.972981 4983 scope.go:117] "RemoveContainer" containerID="86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404" Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.992360 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"] Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.997961 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"] Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.006271 4983 scope.go:117] "RemoveContainer" containerID="27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.032668 4983 scope.go:117] "RemoveContainer" containerID="840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb" Mar 16 00:13:13 crc kubenswrapper[4983]: E0316 00:13:13.033030 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb\": container with ID starting with 840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb not found: ID does not exist" containerID="840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.033056 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"} err="failed to get container status \"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb\": rpc error: code = NotFound desc = could not find container \"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb\": container with ID starting with 840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb not found: ID does not exist" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.033075 4983 scope.go:117] "RemoveContainer" containerID="86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404" Mar 16 00:13:13 crc kubenswrapper[4983]: E0316 00:13:13.033884 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404\": container with ID starting with 86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404 not found: ID does not exist" containerID="86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.033927 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"} err="failed to get container status 
\"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404\": rpc error: code = NotFound desc = could not find container \"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404\": container with ID starting with 86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404 not found: ID does not exist" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.033957 4983 scope.go:117] "RemoveContainer" containerID="27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32" Mar 16 00:13:13 crc kubenswrapper[4983]: E0316 00:13:13.034265 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32\": container with ID starting with 27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32 not found: ID does not exist" containerID="27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.034286 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32"} err="failed to get container status \"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32\": rpc error: code = NotFound desc = could not find container \"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32\": container with ID starting with 27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32 not found: ID does not exist" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.140937 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qx9g" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.271737 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") pod \"7bc03354-3cba-40ac-a894-844d6ae1ee69\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.271834 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") pod \"7bc03354-3cba-40ac-a894-844d6ae1ee69\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.271941 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") pod \"7bc03354-3cba-40ac-a894-844d6ae1ee69\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.272594 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities" (OuterVolumeSpecName: "utilities") pod "7bc03354-3cba-40ac-a894-844d6ae1ee69" (UID: "7bc03354-3cba-40ac-a894-844d6ae1ee69"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.275063 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2" (OuterVolumeSpecName: "kube-api-access-x8nr2") pod "7bc03354-3cba-40ac-a894-844d6ae1ee69" (UID: "7bc03354-3cba-40ac-a894-844d6ae1ee69"). InnerVolumeSpecName "kube-api-access-x8nr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.374392 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.374427 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.425721 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bc03354-3cba-40ac-a894-844d6ae1ee69" (UID: "7bc03354-3cba-40ac-a894-844d6ae1ee69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.474991 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.957444 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerDied","Data":"66c02382f4884cf7432e8b1dd2d9aae721248d87c7cd3a1bce60e42991bb56c4"} Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.957836 4983 scope.go:117] "RemoveContainer" containerID="41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.957490 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qx9g" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.980232 4983 scope.go:117] "RemoveContainer" containerID="5315d03c3a0c66cd9452cd1be2631735c8666c6ac21135b6c44ab5b65cd08883" Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.987286 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"] Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.997128 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"] Mar 16 00:13:14 crc kubenswrapper[4983]: I0316 00:13:14.013954 4983 scope.go:117] "RemoveContainer" containerID="b9712062cb37f4ba2339e9dc2def8ff36e2a54d5fce9ebcc83e68db1e8c9e216" Mar 16 00:13:14 crc kubenswrapper[4983]: I0316 00:13:14.101814 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" path="/var/lib/kubelet/pods/00a4a2a2-9263-4b76-8294-fa9c4d918fc7/volumes" Mar 16 00:13:14 crc kubenswrapper[4983]: I0316 00:13:14.102574 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" path="/var/lib/kubelet/pods/7bc03354-3cba-40ac-a894-844d6ae1ee69/volumes" Mar 16 00:13:16 crc kubenswrapper[4983]: I0316 00:13:16.567456 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" containerID="cri-o://5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4" gracePeriod=15 Mar 16 00:13:16 crc kubenswrapper[4983]: I0316 00:13:16.975954 4983 generic.go:334] "Generic (PLEG): container finished" podID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerID="5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4" exitCode=0 Mar 16 00:13:16 crc kubenswrapper[4983]: I0316 00:13:16.976069 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" event={"ID":"0fd829d1-ad38-407e-a576-43aa5a6ca8f2","Type":"ContainerDied","Data":"5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4"} Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.060402 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217025 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217092 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217202 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217268 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217303 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217341 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217373 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217413 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217462 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 
00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217515 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217550 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217588 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217633 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217662 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.218623 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.219336 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.221107 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.221238 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.221258 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.224452 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.224896 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.225145 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.225564 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.226528 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.232225 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.233739 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx" (OuterVolumeSpecName: "kube-api-access-5xzhx") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "kube-api-access-5xzhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.232923 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.233553 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319305 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319341 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319352 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319364 4983 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319378 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319391 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319406 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319418 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319430 4983 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319441 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319454 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319465 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319477 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319489 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.985698 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" event={"ID":"0fd829d1-ad38-407e-a576-43aa5a6ca8f2","Type":"ContainerDied","Data":"992aee5b0776d510c59718dbe65f51126e10a5ddde1021826a4cd33845179277"} Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.985800 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.985855 4983 scope.go:117] "RemoveContainer" containerID="5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4" Mar 16 00:13:18 crc kubenswrapper[4983]: I0316 00:13:18.036524 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:13:18 crc kubenswrapper[4983]: I0316 00:13:18.039028 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:13:18 crc kubenswrapper[4983]: I0316 00:13:18.098672 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" path="/var/lib/kubelet/pods/0fd829d1-ad38-407e-a576-43aa5a6ca8f2/volumes" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.935609 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vr64c"] Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936053 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936064 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936072 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936078 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936085 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936090 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936104 4983 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936110 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936118 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936124 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936133 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936139 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936147 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936153 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936160 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936165 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936173 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936178 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936188 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936193 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936202 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936208 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936218 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936223 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936230 4983 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936235 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936319 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936330 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936341 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936349 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936375 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936760 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.952496 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vr64c"] Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062154 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pj9h\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-kube-api-access-4pj9h\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-tls\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062235 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75689cb1-d790-48a1-91b5-6880d37ecb86-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062299 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-trusted-ca\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062351 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75689cb1-d790-48a1-91b5-6880d37ecb86-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062392 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-bound-sa-token\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062443 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-certificates\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062485 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.088902 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.163929 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-certificates\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164002 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pj9h\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-kube-api-access-4pj9h\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164026 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-tls\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164041 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/75689cb1-d790-48a1-91b5-6880d37ecb86-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164056 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-trusted-ca\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164206 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75689cb1-d790-48a1-91b5-6880d37ecb86-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164234 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-bound-sa-token\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.165623 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-certificates\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.165922 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75689cb1-d790-48a1-91b5-6880d37ecb86-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.167309 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-trusted-ca\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.170009 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75689cb1-d790-48a1-91b5-6880d37ecb86-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.171053 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-tls\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc 
kubenswrapper[4983]: I0316 00:13:21.184478 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-bound-sa-token\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.184724 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pj9h\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-kube-api-access-4pj9h\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.317122 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.695652 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vr64c"] Mar 16 00:13:21 crc kubenswrapper[4983]: W0316 00:13:21.706115 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75689cb1_d790_48a1_91b5_6880d37ecb86.slice/crio-45c6882943a40a289a3173f68c712aed8808c792d721366601a6f8d66e4c0441 WatchSource:0}: Error finding container 45c6882943a40a289a3173f68c712aed8808c792d721366601a6f8d66e4c0441: Status 404 returned error can't find the container with id 45c6882943a40a289a3173f68c712aed8808c792d721366601a6f8d66e4c0441 Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.027642 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" event={"ID":"75689cb1-d790-48a1-91b5-6880d37ecb86","Type":"ContainerStarted","Data":"e46ca17635d18ccc135e4e82d37d32003610abc69d68fd1fcef8227c4b1cd844"} Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.027716 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" event={"ID":"75689cb1-d790-48a1-91b5-6880d37ecb86","Type":"ContainerStarted","Data":"45c6882943a40a289a3173f68c712aed8808c792d721366601a6f8d66e4c0441"} Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.027971 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.058585 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" podStartSLOduration=2.058559523 podStartE2EDuration="2.058559523s" podCreationTimestamp="2026-03-16 00:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:22.052258118 +0000 UTC m=+410.652356588" watchObservedRunningTime="2026-03-16 00:13:22.058559523 +0000 UTC m=+410.658657983" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.730216 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d79794f9d-7s5jx"] Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.731266 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.736190 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.736687 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.736736 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.738085 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.738423 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.739061 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.739247 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.739743 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.739959 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.740461 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.740787 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.750835 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.751310 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.757595 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.764318 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.765618 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d79794f9d-7s5jx"] Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.783680 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " 
pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.783768 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.783842 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.783959 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784006 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784050 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784083 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-session\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784110 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784142 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whj4\" (UniqueName: 
\"kubernetes.io/projected/061c8b83-1d07-4b74-9689-a86e3363a770-kube-api-access-5whj4\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784224 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-audit-policies\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784274 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/061c8b83-1d07-4b74-9689-a86e3363a770-audit-dir\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784299 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784337 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784368 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-error\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885465 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885537 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/061c8b83-1d07-4b74-9689-a86e3363a770-audit-dir\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885592 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885624 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-error\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885682 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885712 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885789 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885811 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885832 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885882 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885906 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-session\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885952 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885977 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whj4\" (UniqueName: \"kubernetes.io/projected/061c8b83-1d07-4b74-9689-a86e3363a770-kube-api-access-5whj4\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.886037 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-audit-policies\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.887078 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/061c8b83-1d07-4b74-9689-a86e3363a770-audit-dir\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.887550 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.887868 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.888864 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-audit-policies\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.889065 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.893123 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.893404 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.894023 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.894586 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-error\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.895380 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.897478 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-session\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.904163 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.904228 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79794f9d-7s5jx\" 
(UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.905849 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whj4\" (UniqueName: \"kubernetes.io/projected/061c8b83-1d07-4b74-9689-a86e3363a770-kube-api-access-5whj4\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:23 crc kubenswrapper[4983]: I0316 00:13:23.055269 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:23 crc kubenswrapper[4983]: I0316 00:13:23.448541 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:13:23 crc kubenswrapper[4983]: I0316 00:13:23.448650 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:13:23 crc kubenswrapper[4983]: I0316 00:13:23.466981 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d79794f9d-7s5jx"] Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.043823 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" event={"ID":"061c8b83-1d07-4b74-9689-a86e3363a770","Type":"ContainerStarted","Data":"31fe1c68d7249884a5bb82e1cd9a84ebe1594199f7c8385d5d8319c48f1031a1"} Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.044131 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" event={"ID":"061c8b83-1d07-4b74-9689-a86e3363a770","Type":"ContainerStarted","Data":"978b3636703efe358b96e42970e9c1b57c6101833ceb06764199ae65aaedb149"} Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.044542 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.082463 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" podStartSLOduration=33.082432313 podStartE2EDuration="33.082432313s" podCreationTimestamp="2026-03-16 00:12:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:24.07234536 +0000 UTC m=+412.672443800" watchObservedRunningTime="2026-03-16 00:13:24.082432313 +0000 UTC m=+412.682530783" Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.306540 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:25 crc kubenswrapper[4983]: I0316 00:13:25.983202 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:13:25 crc 
kubenswrapper[4983]: I0316 00:13:25.984503 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerName="route-controller-manager" containerID="cri-o://fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" gracePeriod=30 Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.500625 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.641817 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") pod \"17c999f7-aab6-48d2-afe8-2c317c1825f5\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.641907 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") pod \"17c999f7-aab6-48d2-afe8-2c317c1825f5\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.641937 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") pod \"17c999f7-aab6-48d2-afe8-2c317c1825f5\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.641962 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") pod \"17c999f7-aab6-48d2-afe8-2c317c1825f5\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.642814 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config" (OuterVolumeSpecName: "config") pod "17c999f7-aab6-48d2-afe8-2c317c1825f5" (UID: "17c999f7-aab6-48d2-afe8-2c317c1825f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.643648 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca" (OuterVolumeSpecName: "client-ca") pod "17c999f7-aab6-48d2-afe8-2c317c1825f5" (UID: "17c999f7-aab6-48d2-afe8-2c317c1825f5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.646271 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17c999f7-aab6-48d2-afe8-2c317c1825f5" (UID: "17c999f7-aab6-48d2-afe8-2c317c1825f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.646646 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r" (OuterVolumeSpecName: "kube-api-access-2pg6r") pod "17c999f7-aab6-48d2-afe8-2c317c1825f5" (UID: "17c999f7-aab6-48d2-afe8-2c317c1825f5"). InnerVolumeSpecName "kube-api-access-2pg6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.743483 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.743518 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.743527 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.743536 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066071 4983 generic.go:334] "Generic (PLEG): container finished" podID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerID="fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" exitCode=0 Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" event={"ID":"17c999f7-aab6-48d2-afe8-2c317c1825f5","Type":"ContainerDied","Data":"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885"} Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066183 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066203 4983 scope.go:117] "RemoveContainer" containerID="fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066188 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" event={"ID":"17c999f7-aab6-48d2-afe8-2c317c1825f5","Type":"ContainerDied","Data":"876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a"} Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.092243 4983 scope.go:117] "RemoveContainer" containerID="fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" Mar 16 00:13:27 crc kubenswrapper[4983]: E0316 00:13:27.093285 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885\": container with ID starting with fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885 not found: ID does not exist" containerID="fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.093351 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885"} err="failed to get container status \"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885\": rpc error: code = NotFound desc = could not find container \"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885\": container with ID starting with fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885 not found: ID does not exist" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.106124 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.110235 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.734575 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq"] Mar 16 00:13:27 crc kubenswrapper[4983]: E0316 00:13:27.734914 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerName="route-controller-manager" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.734939 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerName="route-controller-manager" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.735150 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerName="route-controller-manager" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.735911 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.739471 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.739491 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.739522 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.739479 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.740131 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.740161 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.753581 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq"] Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.858047 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-config\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.858135 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-client-ca\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.858192 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvtk8\" (UniqueName: \"kubernetes.io/projected/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-kube-api-access-nvtk8\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.858227 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-serving-cert\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.960088 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-config\") pod 
\"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.960180 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-client-ca\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.960243 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvtk8\" (UniqueName: \"kubernetes.io/projected/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-kube-api-access-nvtk8\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.960294 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-serving-cert\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.962041 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-client-ca\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.965979 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-serving-cert\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.966104 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-config\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.983072 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvtk8\" (UniqueName: \"kubernetes.io/projected/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-kube-api-access-nvtk8\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:28 crc kubenswrapper[4983]: I0316 00:13:28.062108 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:28 crc kubenswrapper[4983]: I0316 00:13:28.104518 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" path="/var/lib/kubelet/pods/17c999f7-aab6-48d2-afe8-2c317c1825f5/volumes" Mar 16 00:13:28 crc kubenswrapper[4983]: I0316 00:13:28.435538 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq"] Mar 16 00:13:28 crc kubenswrapper[4983]: W0316 00:13:28.440249 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9278fbf1_bc49_4361_b1c8_4b63798e5fc7.slice/crio-51b1a655b22ec79d30073adfceb6012a3c86c1a7d2dae88fd50f188bf2c1e195 WatchSource:0}: Error finding container 51b1a655b22ec79d30073adfceb6012a3c86c1a7d2dae88fd50f188bf2c1e195: Status 404 returned error can't find the container with id 51b1a655b22ec79d30073adfceb6012a3c86c1a7d2dae88fd50f188bf2c1e195 Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.077564 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" event={"ID":"9278fbf1-bc49-4361-b1c8-4b63798e5fc7","Type":"ContainerStarted","Data":"206cee71b0e6278aac3686b3eb2ce2742fc538494d3d0c96acd65e3c8a7c0155"} Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.077957 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" event={"ID":"9278fbf1-bc49-4361-b1c8-4b63798e5fc7","Type":"ContainerStarted","Data":"51b1a655b22ec79d30073adfceb6012a3c86c1a7d2dae88fd50f188bf2c1e195"} Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.077982 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.082573 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.095161 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" podStartSLOduration=4.095141636 podStartE2EDuration="4.095141636s" podCreationTimestamp="2026-03-16 00:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:29.091365377 +0000 UTC m=+417.691463827" watchObservedRunningTime="2026-03-16 00:13:29.095141636 +0000 UTC m=+417.695240066" Mar 16 00:13:41 crc kubenswrapper[4983]: I0316 00:13:41.326046 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:41 crc kubenswrapper[4983]: I0316 00:13:41.400642 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:13:53 crc kubenswrapper[4983]: I0316 00:13:53.448607 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 16 00:13:53 crc kubenswrapper[4983]: I0316 00:13:53.449445 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.145536 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.149362 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.151940 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.152285 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.152688 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.153569 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.294535 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") pod \"auto-csr-approver-29560334-5n4gc\" (UID: \"272489bc-7bd4-4421-930d-150816da83b8\") " pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.396289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") pod \"auto-csr-approver-29560334-5n4gc\" (UID: \"272489bc-7bd4-4421-930d-150816da83b8\") " pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.421155 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") pod \"auto-csr-approver-29560334-5n4gc\" (UID: \"272489bc-7bd4-4421-930d-150816da83b8\") " pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.479435 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:01 crc kubenswrapper[4983]: I0316 00:14:00.940902 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:14:01 crc kubenswrapper[4983]: I0316 00:14:01.290324 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" event={"ID":"272489bc-7bd4-4421-930d-150816da83b8","Type":"ContainerStarted","Data":"85bcf12afe7ba41a760a19fd98d39c6447bcc6bc7f82af3fb2a49fc3e57ff35b"} Mar 16 00:14:02 crc kubenswrapper[4983]: I0316 00:14:02.295660 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" event={"ID":"272489bc-7bd4-4421-930d-150816da83b8","Type":"ContainerStarted","Data":"5fe0de833b2b27c1bfe835628ef9c6dca727580c2781fda123b15ad86663176a"} Mar 16 00:14:03 crc kubenswrapper[4983]: I0316 00:14:03.307351 4983 generic.go:334] "Generic (PLEG): container finished" podID="272489bc-7bd4-4421-930d-150816da83b8" containerID="5fe0de833b2b27c1bfe835628ef9c6dca727580c2781fda123b15ad86663176a" exitCode=0 Mar 16 00:14:03 crc kubenswrapper[4983]: I0316 00:14:03.308616 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" event={"ID":"272489bc-7bd4-4421-930d-150816da83b8","Type":"ContainerDied","Data":"5fe0de833b2b27c1bfe835628ef9c6dca727580c2781fda123b15ad86663176a"} Mar 16 00:14:04 crc kubenswrapper[4983]: I0316 00:14:04.562983 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:04 crc kubenswrapper[4983]: I0316 00:14:04.577583 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") pod \"272489bc-7bd4-4421-930d-150816da83b8\" (UID: \"272489bc-7bd4-4421-930d-150816da83b8\") " Mar 16 00:14:04 crc kubenswrapper[4983]: I0316 00:14:04.621868 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c" (OuterVolumeSpecName: "kube-api-access-sbf4c") pod "272489bc-7bd4-4421-930d-150816da83b8" (UID: "272489bc-7bd4-4421-930d-150816da83b8"). InnerVolumeSpecName "kube-api-access-sbf4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:04 crc kubenswrapper[4983]: I0316 00:14:04.679004 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.156696 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.160178 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.321433 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" event={"ID":"272489bc-7bd4-4421-930d-150816da83b8","Type":"ContainerDied","Data":"85bcf12afe7ba41a760a19fd98d39c6447bcc6bc7f82af3fb2a49fc3e57ff35b"} Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.321480 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85bcf12afe7ba41a760a19fd98d39c6447bcc6bc7f82af3fb2a49fc3e57ff35b" Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.321484 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.100175 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" path="/var/lib/kubelet/pods/9da42bf3-da76-4db7-9653-f2f08567084f/volumes" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.435486 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerName="registry" containerID="cri-o://b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" gracePeriod=30 Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.819353 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.910687 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.910995 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911039 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911067 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911099 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911121 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911181 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911235 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.912632 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.912863 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.918165 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.918270 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.921784 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.922934 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s" (OuterVolumeSpecName: "kube-api-access-x9n5s") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "kube-api-access-x9n5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.929333 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.940429 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.012997 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013052 4983 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013110 4983 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013131 4983 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013150 4983 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013170 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013189 4983 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.175175 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.175468 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hsgsl" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" containerID="cri-o://ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.185327 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.185564 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vxnxc" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="registry-server" containerID="cri-o://6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.193573 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.193807 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" 
containerID="cri-o://44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.201073 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.201539 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b68d7" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="registry-server" containerID="cri-o://c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.210839 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.211070 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-56c2t" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="registry-server" containerID="cri-o://ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.231363 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pvjtd"] Mar 16 00:14:07 crc kubenswrapper[4983]: E0316 00:14:07.232083 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272489bc-7bd4-4421-930d-150816da83b8" containerName="oc" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.232109 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="272489bc-7bd4-4421-930d-150816da83b8" containerName="oc" Mar 16 00:14:07 crc kubenswrapper[4983]: E0316 00:14:07.232142 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerName="registry" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.232151 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerName="registry" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.232599 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="272489bc-7bd4-4421-930d-150816da83b8" containerName="oc" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.232631 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerName="registry" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.233243 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.256443 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pvjtd"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.320689 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.320795 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwllv\" (UniqueName: \"kubernetes.io/projected/46c9f8c6-7d08-47e7-866d-7f359e8683be-kube-api-access-wwllv\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.320836 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.343823 4983 generic.go:334] "Generic (PLEG): container finished" podID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerID="ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.343907 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerDied","Data":"ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.353058 4983 generic.go:334] "Generic (PLEG): container finished" podID="87a722ee-1078-41fd-bd5e-96981b43652d" containerID="44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.353149 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" event={"ID":"87a722ee-1078-41fd-bd5e-96981b43652d","Type":"ContainerDied","Data":"44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355377 4983 generic.go:334] "Generic (PLEG): container finished" podID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerID="b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355468 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" event={"ID":"0a099f86-8967-4361-bbbf-4dfa8385d2f2","Type":"ContainerDied","Data":"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355486 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355502 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" event={"ID":"0a099f86-8967-4361-bbbf-4dfa8385d2f2","Type":"ContainerDied","Data":"e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355526 4983 scope.go:117] "RemoveContainer" containerID="b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.364472 4983 generic.go:334] "Generic (PLEG): container finished" podID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerID="6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.364543 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerDied","Data":"6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.369797 4983 generic.go:334] "Generic (PLEG): container finished" podID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerID="c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.369860 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerDied","Data":"c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.375564 4983 generic.go:334] "Generic (PLEG): container finished" podID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerID="ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.375612 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerDied","Data":"ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.422049 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.422132 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.422173 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwllv\" (UniqueName: \"kubernetes.io/projected/46c9f8c6-7d08-47e7-866d-7f359e8683be-kube-api-access-wwllv\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: 
\"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.423264 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.428378 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.438323 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwllv\" (UniqueName: \"kubernetes.io/projected/46c9f8c6-7d08-47e7-866d-7f359e8683be-kube-api-access-wwllv\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.671638 4983 scope.go:117] "RemoveContainer" containerID="b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" Mar 16 00:14:07 crc kubenswrapper[4983]: E0316 00:14:07.673189 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae\": container with ID starting with b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae not found: ID does not exist" containerID="b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.673245 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae"} err="failed to get container status \"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae\": rpc error: code = NotFound desc = could not find container \"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae\": container with ID starting with b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae not found: ID does not exist" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.676261 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.691818 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.710086 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.713418 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.728602 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") pod \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.728686 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") pod \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.728791 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") pod \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.730052 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities" (OuterVolumeSpecName: "utilities") pod "f617dbbc-f757-49b9-b8c6-7d0c07cb197e" (UID: "f617dbbc-f757-49b9-b8c6-7d0c07cb197e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.730650 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.734036 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2" (OuterVolumeSpecName: "kube-api-access-xbbl2") pod "f617dbbc-f757-49b9-b8c6-7d0c07cb197e" (UID: "f617dbbc-f757-49b9-b8c6-7d0c07cb197e"). InnerVolumeSpecName "kube-api-access-xbbl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.788108 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.797359 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.825369 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.826809 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831278 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") pod \"87a722ee-1078-41fd-bd5e-96981b43652d\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831330 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") pod \"8fd3d4ca-4839-4327-8121-fe6ba21051da\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831396 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") pod \"87a722ee-1078-41fd-bd5e-96981b43652d\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831448 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") pod \"87a722ee-1078-41fd-bd5e-96981b43652d\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831499 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") pod \"8fd3d4ca-4839-4327-8121-fe6ba21051da\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831536 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") pod \"8fd3d4ca-4839-4327-8121-fe6ba21051da\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831792 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.833343 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities" (OuterVolumeSpecName: "utilities") pod "8fd3d4ca-4839-4327-8121-fe6ba21051da" (UID: "8fd3d4ca-4839-4327-8121-fe6ba21051da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.834827 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "87a722ee-1078-41fd-bd5e-96981b43652d" (UID: "87a722ee-1078-41fd-bd5e-96981b43652d"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.839300 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f" (OuterVolumeSpecName: "kube-api-access-l8w9f") pod "87a722ee-1078-41fd-bd5e-96981b43652d" (UID: "87a722ee-1078-41fd-bd5e-96981b43652d"). InnerVolumeSpecName "kube-api-access-l8w9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.841401 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "87a722ee-1078-41fd-bd5e-96981b43652d" (UID: "87a722ee-1078-41fd-bd5e-96981b43652d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.843386 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s" (OuterVolumeSpecName: "kube-api-access-cm28s") pod "8fd3d4ca-4839-4327-8121-fe6ba21051da" (UID: "8fd3d4ca-4839-4327-8121-fe6ba21051da"). InnerVolumeSpecName "kube-api-access-cm28s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.863069 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f617dbbc-f757-49b9-b8c6-7d0c07cb197e" (UID: "f617dbbc-f757-49b9-b8c6-7d0c07cb197e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.890638 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fd3d4ca-4839-4327-8121-fe6ba21051da" (UID: "8fd3d4ca-4839-4327-8121-fe6ba21051da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.935496 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") pod \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.935873 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") pod \"cbebf69d-773f-4829-a4ec-e443d52ef275\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.935987 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kmd9\" (UniqueName: \"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") pod \"cbebf69d-773f-4829-a4ec-e443d52ef275\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.936070 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") pod \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.936107 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") pod \"cbebf69d-773f-4829-a4ec-e443d52ef275\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.936127 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") pod \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.936860 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities" (OuterVolumeSpecName: "utilities") pod "cbebf69d-773f-4829-a4ec-e443d52ef275" (UID: "cbebf69d-773f-4829-a4ec-e443d52ef275"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937193 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937206 4983 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937217 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937227 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937260 4983 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937270 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937279 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937288 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937312 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities" (OuterVolumeSpecName: "utilities") pod "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" (UID: "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.940282 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49" (OuterVolumeSpecName: "kube-api-access-msk49") pod "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" (UID: "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07"). InnerVolumeSpecName "kube-api-access-msk49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.940454 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9" (OuterVolumeSpecName: "kube-api-access-6kmd9") pod "cbebf69d-773f-4829-a4ec-e443d52ef275" (UID: "cbebf69d-773f-4829-a4ec-e443d52ef275"). InnerVolumeSpecName "kube-api-access-6kmd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.965745 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbebf69d-773f-4829-a4ec-e443d52ef275" (UID: "cbebf69d-773f-4829-a4ec-e443d52ef275"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.038307 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.038371 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.038391 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.038408 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kmd9\" (UniqueName: \"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.054483 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" (UID: "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.100223 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" path="/var/lib/kubelet/pods/0a099f86-8967-4361-bbbf-4dfa8385d2f2/volumes" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.139888 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.161255 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pvjtd"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.382984 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerDied","Data":"839c30c9cbe107a7c9f0dd7cc6175826e37c3a950a4d5a9be034e934974f0bc3"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.383036 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.383042 4983 scope.go:117] "RemoveContainer" containerID="ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.386137 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.385940 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerDied","Data":"de21ac29d1b3f85746eecc6275790d886e43e62e160f35ab6e888afb27d08a5c"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.387602 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" event={"ID":"87a722ee-1078-41fd-bd5e-96981b43652d","Type":"ContainerDied","Data":"1965cf54da33760615e034ca9db488c5481e59caf0aa16831ccaefaf972dbc39"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.387638 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.388965 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" event={"ID":"46c9f8c6-7d08-47e7-866d-7f359e8683be","Type":"ContainerStarted","Data":"04c976cb885ffa3503044e3e837dd223eba7c9aa0c7e27dd3416f595aad5a275"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.388994 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" event={"ID":"46c9f8c6-7d08-47e7-866d-7f359e8683be","Type":"ContainerStarted","Data":"ece972a7fbeda3de85367618c930e653ffab76f99d167fdc32f7a49ed0000814"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.389574 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.391145 4983 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pvjtd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" start-of-body= Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.391201 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" podUID="46c9f8c6-7d08-47e7-866d-7f359e8683be" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.395020 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerDied","Data":"560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.395176 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.397994 4983 scope.go:117] "RemoveContainer" containerID="4fd735d9c2a8af79e35b41af9d3f84d5c4faeb3f496099e9f47662ac9f90becf" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.400339 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerDied","Data":"5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.400380 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.409275 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.413991 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.419524 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.423236 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.430064 4983 scope.go:117] "RemoveContainer" containerID="0601a98e47222baf45860438cfc29d0447fa64cf46cd7bead9a6ef97f07beb9c" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.434049 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.439575 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.444666 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.448081 4983 scope.go:117] "RemoveContainer" containerID="ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.451547 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.457990 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" podStartSLOduration=1.457972547 podStartE2EDuration="1.457972547s" podCreationTimestamp="2026-03-16 00:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:08.45550802 +0000 UTC m=+457.055606450" watchObservedRunningTime="2026-03-16 00:14:08.457972547 +0000 UTC m=+457.058070987" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.470241 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.474377 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.476859 4983 scope.go:117] 
"RemoveContainer" containerID="de0cee5fa65ae8acc06500ed4f7bfd1b7fc45fe51327cba7b49afb9439e0134f" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.492909 4983 scope.go:117] "RemoveContainer" containerID="2c8a01779fdf7320586832f975808a3323314fc1dee647ee11f25e6ca498d9a4" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.505285 4983 scope.go:117] "RemoveContainer" containerID="44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.517427 4983 scope.go:117] "RemoveContainer" containerID="6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.530418 4983 scope.go:117] "RemoveContainer" containerID="210bd7f5ab48e451b18cd186b0e612a0157714bee428a4d39d25cdd92c0f3eb0" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.543495 4983 scope.go:117] "RemoveContainer" containerID="1fc80a9e4fb01c05cb775f45190ece9037ca337a03452dd8abf5a08dd242d1da" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.559971 4983 scope.go:117] "RemoveContainer" containerID="c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.576654 4983 scope.go:117] "RemoveContainer" containerID="b6acaa7dffa774e191a9bf342869bf819b4d039ee2bd145b14e03704f80e4abc" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.597845 4983 scope.go:117] "RemoveContainer" containerID="b832baa9ad863d92bef0f4bd68918c75a656cd7a0c7e14efd5e15110ac3d6de8" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.192913 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193381 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193394 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193406 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193412 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193421 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193427 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193440 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193445 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193452 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193458 
4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193466 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193473 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193482 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193487 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193494 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193500 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193507 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193513 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193521 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193526 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193533 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193539 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193546 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193551 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193560 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193567 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193660 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="registry-server" Mar 16 00:14:09 crc 
kubenswrapper[4983]: I0316 00:14:09.193681 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193691 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193701 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193709 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.194557 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.198461 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.218082 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.255315 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.255380 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.255398 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.356782 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.356841 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.356913 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.357454 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.357523 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.373969 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.412330 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.511249 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.785972 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rxmlr"] Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.786844 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.794326 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.800796 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rxmlr"] Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.863947 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-utilities\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.863987 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-catalog-content\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.864021 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxvd\" (UniqueName: \"kubernetes.io/projected/b4e15b89-9659-49da-bccb-c826ebceeb93-kube-api-access-9cxvd\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.938928 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:14:09 crc kubenswrapper[4983]: W0316 00:14:09.946818 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cf6a4e_082d_473f_8640_b1eb9b6591d2.slice/crio-3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541 WatchSource:0}: Error finding container 3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541: Status 404 returned error can't find the container with id 3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541 Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.965658 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxvd\" (UniqueName: \"kubernetes.io/projected/b4e15b89-9659-49da-bccb-c826ebceeb93-kube-api-access-9cxvd\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.965775 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-utilities\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.965791 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-catalog-content\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " 
pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.966290 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-catalog-content\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.966777 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-utilities\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.983057 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxvd\" (UniqueName: \"kubernetes.io/projected/b4e15b89-9659-49da-bccb-c826ebceeb93-kube-api-access-9cxvd\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.101104 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" path="/var/lib/kubelet/pods/87a722ee-1078-41fd-bd5e-96981b43652d/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.102421 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" path="/var/lib/kubelet/pods/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.103523 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" path="/var/lib/kubelet/pods/8fd3d4ca-4839-4327-8121-fe6ba21051da/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.105075 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" path="/var/lib/kubelet/pods/cbebf69d-773f-4829-a4ec-e443d52ef275/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.105942 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" path="/var/lib/kubelet/pods/f617dbbc-f757-49b9-b8c6-7d0c07cb197e/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.117359 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.415657 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerID="0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50" exitCode=0 Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.415782 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerDied","Data":"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50"} Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.415828 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerStarted","Data":"3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541"} Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.487385 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rxmlr"] Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.421962 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4e15b89-9659-49da-bccb-c826ebceeb93" containerID="1ed184c41fbbe0cfdd26246e20d2e13e062bc90755594cdea61691aeae60a359" exitCode=0 Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.422061 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxmlr" event={"ID":"b4e15b89-9659-49da-bccb-c826ebceeb93","Type":"ContainerDied","Data":"1ed184c41fbbe0cfdd26246e20d2e13e062bc90755594cdea61691aeae60a359"} Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.422336 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxmlr" event={"ID":"b4e15b89-9659-49da-bccb-c826ebceeb93","Type":"ContainerStarted","Data":"59ad535c85447fad3a3389620dc6bbddc089f8ff0f21ad7bbae0c5487b26da3b"} Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.424686 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerID="361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d" exitCode=0 Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.424742 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerDied","Data":"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d"} Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.590629 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8hjdk"] Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.593434 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.595995 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.596337 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hjdk"] Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.686946 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-catalog-content\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.687004 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrnf\" (UniqueName: \"kubernetes.io/projected/628d0b6e-5772-4af2-aa28-28cc15bd5d60-kube-api-access-kdrnf\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.687225 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-utilities\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.788599 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-utilities\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.788689 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-catalog-content\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.788719 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdrnf\" (UniqueName: \"kubernetes.io/projected/628d0b6e-5772-4af2-aa28-28cc15bd5d60-kube-api-access-kdrnf\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.789077 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-utilities\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.789157 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-catalog-content\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " 
pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.807622 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrnf\" (UniqueName: \"kubernetes.io/projected/628d0b6e-5772-4af2-aa28-28cc15bd5d60-kube-api-access-kdrnf\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.912855 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.202698 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95rsh"] Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.204162 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.207654 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.209357 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95rsh"] Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.307893 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-utilities\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.307943 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-catalog-content\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.308094 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxbg\" (UniqueName: \"kubernetes.io/projected/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-kube-api-access-6vxbg\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.356049 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hjdk"] Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.409455 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxbg\" (UniqueName: \"kubernetes.io/projected/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-kube-api-access-6vxbg\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.409510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-utilities\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " 
pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.409545 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-catalog-content\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.410040 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-catalog-content\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.410093 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-utilities\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.426862 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxbg\" (UniqueName: \"kubernetes.io/projected/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-kube-api-access-6vxbg\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.433983 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4e15b89-9659-49da-bccb-c826ebceeb93" containerID="5760f0b4afd0b8ec7ad7ef8450f92118348dc093bd56a33aab992dad1ec1b8b1" exitCode=0 Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.434044 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxmlr" event={"ID":"b4e15b89-9659-49da-bccb-c826ebceeb93","Type":"ContainerDied","Data":"5760f0b4afd0b8ec7ad7ef8450f92118348dc093bd56a33aab992dad1ec1b8b1"} Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.438099 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerStarted","Data":"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010"} Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.439224 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerStarted","Data":"952ad432709f7e1f20c8ed834fb41075eb679c955619d8842a55ef23f4eb92d8"} Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.519700 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.909912 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hgt5w" podStartSLOduration=2.446534504 podStartE2EDuration="3.909897179s" podCreationTimestamp="2026-03-16 00:14:09 +0000 UTC" firstStartedPulling="2026-03-16 00:14:10.417021644 +0000 UTC m=+459.017120074" lastFinishedPulling="2026-03-16 00:14:11.880384319 +0000 UTC m=+460.480482749" observedRunningTime="2026-03-16 00:14:12.475160412 +0000 UTC m=+461.075258862" watchObservedRunningTime="2026-03-16 00:14:12.909897179 +0000 UTC m=+461.509995609" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.912065 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95rsh"] Mar 16 00:14:12 crc kubenswrapper[4983]: W0316 00:14:12.916911 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4deeaa90_9b0b_47cb_a8bf_4b2524a736a8.slice/crio-86dd4e8c5f987a5e21ab707ebc1543aabcbdb1e0c21e81b09fab6bfb600f6840 WatchSource:0}: Error finding container 86dd4e8c5f987a5e21ab707ebc1543aabcbdb1e0c21e81b09fab6bfb600f6840: Status 404 returned error can't find the container with id 86dd4e8c5f987a5e21ab707ebc1543aabcbdb1e0c21e81b09fab6bfb600f6840 Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.446011 4983 generic.go:334] "Generic (PLEG): container finished" podID="628d0b6e-5772-4af2-aa28-28cc15bd5d60" containerID="03812425a0066a0cb8753010a44bc79a1afe1757459aafa6771375ce0923d821" exitCode=0 Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.446107 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerDied","Data":"03812425a0066a0cb8753010a44bc79a1afe1757459aafa6771375ce0923d821"} Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.451639 4983 generic.go:334] "Generic (PLEG): container finished" podID="4deeaa90-9b0b-47cb-a8bf-4b2524a736a8" containerID="103a145016d5f85202efc125ca9860d61c1a070c8c81a59e7c305c82b79be272" exitCode=0 Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.451709 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerDied","Data":"103a145016d5f85202efc125ca9860d61c1a070c8c81a59e7c305c82b79be272"} Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.451731 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerStarted","Data":"86dd4e8c5f987a5e21ab707ebc1543aabcbdb1e0c21e81b09fab6bfb600f6840"} Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.460916 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxmlr" event={"ID":"b4e15b89-9659-49da-bccb-c826ebceeb93","Type":"ContainerStarted","Data":"819e219f8fa52b7d9a4b13b5f4a608060200d7a98f5caf2d3bc3fc96d9268e66"} Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.530893 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rxmlr" podStartSLOduration=3.087410551 podStartE2EDuration="4.530875975s" podCreationTimestamp="2026-03-16 00:14:09 +0000 UTC" 
firstStartedPulling="2026-03-16 00:14:11.423768807 +0000 UTC m=+460.023867237" lastFinishedPulling="2026-03-16 00:14:12.867234231 +0000 UTC m=+461.467332661" observedRunningTime="2026-03-16 00:14:13.496046639 +0000 UTC m=+462.096145089" watchObservedRunningTime="2026-03-16 00:14:13.530875975 +0000 UTC m=+462.130974405" Mar 16 00:14:14 crc kubenswrapper[4983]: I0316 00:14:14.467312 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerStarted","Data":"b826077cc335e020e32973b7b2699a884131448fddff91106fcd824b02c99b9b"} Mar 16 00:14:14 crc kubenswrapper[4983]: I0316 00:14:14.468971 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerStarted","Data":"c368f85fd8ac6934d02456de71a6a8584c04ea955f47fac632bb3564121c63b3"} Mar 16 00:14:15 crc kubenswrapper[4983]: I0316 00:14:15.478083 4983 generic.go:334] "Generic (PLEG): container finished" podID="628d0b6e-5772-4af2-aa28-28cc15bd5d60" containerID="b826077cc335e020e32973b7b2699a884131448fddff91106fcd824b02c99b9b" exitCode=0 Mar 16 00:14:15 crc kubenswrapper[4983]: I0316 00:14:15.478157 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerDied","Data":"b826077cc335e020e32973b7b2699a884131448fddff91106fcd824b02c99b9b"} Mar 16 00:14:15 crc kubenswrapper[4983]: I0316 00:14:15.480212 4983 generic.go:334] "Generic (PLEG): container finished" podID="4deeaa90-9b0b-47cb-a8bf-4b2524a736a8" containerID="c368f85fd8ac6934d02456de71a6a8584c04ea955f47fac632bb3564121c63b3" exitCode=0 Mar 16 00:14:15 crc kubenswrapper[4983]: I0316 00:14:15.480237 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerDied","Data":"c368f85fd8ac6934d02456de71a6a8584c04ea955f47fac632bb3564121c63b3"} Mar 16 00:14:16 crc kubenswrapper[4983]: I0316 00:14:16.487613 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerStarted","Data":"b8f171ee59fb652ce06fb93a8fbdcf904f55f167d019cfec9066291fccdf630d"} Mar 16 00:14:16 crc kubenswrapper[4983]: I0316 00:14:16.489608 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerStarted","Data":"7e889058e0047e4b7eba362a74c3c0fc59c3cc9c26af5e15dc1668da9a757f57"} Mar 16 00:14:16 crc kubenswrapper[4983]: I0316 00:14:16.506795 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-95rsh" podStartSLOduration=2.081783237 podStartE2EDuration="4.50677788s" podCreationTimestamp="2026-03-16 00:14:12 +0000 UTC" firstStartedPulling="2026-03-16 00:14:13.465007216 +0000 UTC m=+462.065105656" lastFinishedPulling="2026-03-16 00:14:15.890001869 +0000 UTC m=+464.490100299" observedRunningTime="2026-03-16 00:14:16.506387509 +0000 UTC m=+465.106485949" watchObservedRunningTime="2026-03-16 00:14:16.50677788 +0000 UTC m=+465.106876310" Mar 16 00:14:16 crc kubenswrapper[4983]: I0316 00:14:16.527627 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-8hjdk" podStartSLOduration=3.095277733 podStartE2EDuration="5.527611085s" podCreationTimestamp="2026-03-16 00:14:11 +0000 UTC" firstStartedPulling="2026-03-16 00:14:13.448616781 +0000 UTC m=+462.048715211" lastFinishedPulling="2026-03-16 00:14:15.880950133 +0000 UTC m=+464.481048563" observedRunningTime="2026-03-16 00:14:16.526875175 +0000 UTC m=+465.126973605" watchObservedRunningTime="2026-03-16 00:14:16.527611085 +0000 UTC m=+465.127709515" Mar 16 00:14:19 crc kubenswrapper[4983]: I0316 00:14:19.511845 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:19 crc kubenswrapper[4983]: I0316 00:14:19.527782 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:19 crc kubenswrapper[4983]: I0316 00:14:19.573491 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.118233 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.118320 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.158799 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.548140 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.560392 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:21 crc kubenswrapper[4983]: I0316 00:14:21.913086 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:21 crc kubenswrapper[4983]: I0316 00:14:21.913148 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:22 crc kubenswrapper[4983]: I0316 00:14:22.520231 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:22 crc kubenswrapper[4983]: I0316 00:14:22.520273 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:22 crc kubenswrapper[4983]: I0316 00:14:22.566129 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:22 crc kubenswrapper[4983]: I0316 00:14:22.957866 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8hjdk" podUID="628d0b6e-5772-4af2-aa28-28cc15bd5d60" containerName="registry-server" probeResult="failure" output=< Mar 16 00:14:22 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s Mar 16 00:14:22 crc kubenswrapper[4983]: > Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.448040 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.449198 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.449296 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.449742 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.449826 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c" gracePeriod=600 Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.579811 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:24 crc kubenswrapper[4983]: I0316 00:14:24.542432 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c" exitCode=0 Mar 16 00:14:24 crc kubenswrapper[4983]: I0316 00:14:24.542519 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c"} Mar 16 00:14:24 crc kubenswrapper[4983]: I0316 00:14:24.542710 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f"} Mar 16 00:14:24 crc kubenswrapper[4983]: I0316 00:14:24.542738 4983 scope.go:117] "RemoveContainer" containerID="25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383" Mar 16 00:14:31 crc kubenswrapper[4983]: I0316 00:14:31.977148 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:32 crc kubenswrapper[4983]: I0316 00:14:32.041028 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.140180 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb"] Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.143270 4983 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.146194 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.146504 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.151218 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb"] Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.299474 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.299937 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.300015 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.401028 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.401075 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.401113 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.402565 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.408487 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.417741 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.465564 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.885976 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb"] Mar 16 00:15:01 crc kubenswrapper[4983]: I0316 00:15:01.763435 4983 generic.go:334] "Generic (PLEG): container finished" podID="1ff90261-e4e9-4ff3-86a2-6a0274e9454e" containerID="def262889d3b7fe5b27fe4d3deb37511089ea35b2bea84f3d8f7a004f334c93b" exitCode=0 Mar 16 00:15:01 crc kubenswrapper[4983]: I0316 00:15:01.763824 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" event={"ID":"1ff90261-e4e9-4ff3-86a2-6a0274e9454e","Type":"ContainerDied","Data":"def262889d3b7fe5b27fe4d3deb37511089ea35b2bea84f3d8f7a004f334c93b"} Mar 16 00:15:01 crc kubenswrapper[4983]: I0316 00:15:01.763865 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" event={"ID":"1ff90261-e4e9-4ff3-86a2-6a0274e9454e","Type":"ContainerStarted","Data":"32ccd45dbf3d3c04feb7c91ef200fb408fbc1cead4c28f3dd2087b6a592da66f"} Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.032855 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.132697 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") pod \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.132805 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") pod \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.132849 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") pod \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.133326 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ff90261-e4e9-4ff3-86a2-6a0274e9454e" (UID: "1ff90261-e4e9-4ff3-86a2-6a0274e9454e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.138868 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ff90261-e4e9-4ff3-86a2-6a0274e9454e" (UID: "1ff90261-e4e9-4ff3-86a2-6a0274e9454e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.140065 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85" (OuterVolumeSpecName: "kube-api-access-58p85") pod "1ff90261-e4e9-4ff3-86a2-6a0274e9454e" (UID: "1ff90261-e4e9-4ff3-86a2-6a0274e9454e"). InnerVolumeSpecName "kube-api-access-58p85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.234430 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.234471 4983 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.234490 4983 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.775538 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" event={"ID":"1ff90261-e4e9-4ff3-86a2-6a0274e9454e","Type":"ContainerDied","Data":"32ccd45dbf3d3c04feb7c91ef200fb408fbc1cead4c28f3dd2087b6a592da66f"} Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.775575 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.775579 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ccd45dbf3d3c04feb7c91ef200fb408fbc1cead4c28f3dd2087b6a592da66f" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.151042 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:16:00 crc kubenswrapper[4983]: E0316 00:16:00.152025 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff90261-e4e9-4ff3-86a2-6a0274e9454e" containerName="collect-profiles" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.152047 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff90261-e4e9-4ff3-86a2-6a0274e9454e" containerName="collect-profiles" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.152252 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff90261-e4e9-4ff3-86a2-6a0274e9454e" containerName="collect-profiles" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.152867 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.155460 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.158149 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.158274 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.159880 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.275152 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") pod \"auto-csr-approver-29560336-6d4qf\" (UID: \"b56bb064-30c4-4aaf-a4d2-c81006425b62\") " pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.375924 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") pod \"auto-csr-approver-29560336-6d4qf\" (UID: \"b56bb064-30c4-4aaf-a4d2-c81006425b62\") " pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.400607 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") pod \"auto-csr-approver-29560336-6d4qf\" (UID: \"b56bb064-30c4-4aaf-a4d2-c81006425b62\") " pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.514384 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.714896 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.723493 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:16:01 crc kubenswrapper[4983]: I0316 00:16:01.157499 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" event={"ID":"b56bb064-30c4-4aaf-a4d2-c81006425b62","Type":"ContainerStarted","Data":"62abfec3868b503dddcadd449b49bba04126f6906e655f7ab31e9614fffc7705"} Mar 16 00:16:02 crc kubenswrapper[4983]: I0316 00:16:02.168201 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" event={"ID":"b56bb064-30c4-4aaf-a4d2-c81006425b62","Type":"ContainerStarted","Data":"a092715a78836d6cc7d08c15d4c8579198cd91313410de0ab11035815df03f19"} Mar 16 00:16:02 crc kubenswrapper[4983]: I0316 00:16:02.189438 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" podStartSLOduration=1.166087173 podStartE2EDuration="2.189416839s" podCreationTimestamp="2026-03-16 00:16:00 +0000 UTC" firstStartedPulling="2026-03-16 00:16:00.723258025 +0000 UTC m=+569.323356455" lastFinishedPulling="2026-03-16 00:16:01.746587691 +0000 UTC m=+570.346686121" observedRunningTime="2026-03-16 00:16:02.189235764 +0000 UTC m=+570.789334194" watchObservedRunningTime="2026-03-16 00:16:02.189416839 +0000 UTC m=+570.789515269" Mar 16 00:16:03 crc kubenswrapper[4983]: I0316 00:16:03.174447 4983 generic.go:334] "Generic (PLEG): container finished" podID="b56bb064-30c4-4aaf-a4d2-c81006425b62" containerID="a092715a78836d6cc7d08c15d4c8579198cd91313410de0ab11035815df03f19" exitCode=0 Mar 16 00:16:03 crc kubenswrapper[4983]: I0316 00:16:03.174497 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" event={"ID":"b56bb064-30c4-4aaf-a4d2-c81006425b62","Type":"ContainerDied","Data":"a092715a78836d6cc7d08c15d4c8579198cd91313410de0ab11035815df03f19"} Mar 16 00:16:04 crc kubenswrapper[4983]: I0316 00:16:04.384469 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:04 crc kubenswrapper[4983]: I0316 00:16:04.425729 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") pod \"b56bb064-30c4-4aaf-a4d2-c81006425b62\" (UID: \"b56bb064-30c4-4aaf-a4d2-c81006425b62\") " Mar 16 00:16:04 crc kubenswrapper[4983]: I0316 00:16:04.429827 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc" (OuterVolumeSpecName: "kube-api-access-vrrqc") pod "b56bb064-30c4-4aaf-a4d2-c81006425b62" (UID: "b56bb064-30c4-4aaf-a4d2-c81006425b62"). InnerVolumeSpecName "kube-api-access-vrrqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:16:04 crc kubenswrapper[4983]: I0316 00:16:04.527242 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") on node \"crc\" DevicePath \"\"" Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.179064 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"] Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.185526 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"] Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.187142 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" event={"ID":"b56bb064-30c4-4aaf-a4d2-c81006425b62","Type":"ContainerDied","Data":"62abfec3868b503dddcadd449b49bba04126f6906e655f7ab31e9614fffc7705"} Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.187177 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62abfec3868b503dddcadd449b49bba04126f6906e655f7ab31e9614fffc7705" Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.187185 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:06 crc kubenswrapper[4983]: I0316 00:16:06.101564 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" path="/var/lib/kubelet/pods/c39b8480-5521-4ff7-b6ec-4f67009b1f5c/volumes" Mar 16 00:16:23 crc kubenswrapper[4983]: I0316 00:16:23.448701 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:16:23 crc kubenswrapper[4983]: I0316 00:16:23.449279 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:16:53 crc kubenswrapper[4983]: I0316 00:16:53.448526 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:16:53 crc kubenswrapper[4983]: I0316 00:16:53.449112 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.447901 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:17:23 crc 
kubenswrapper[4983]: I0316 00:17:23.448391 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.448431 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.448942 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.448990 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f" gracePeriod=600 Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.680674 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f" exitCode=0 Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.680736 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f"} Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.680956 4983 scope.go:117] "RemoveContainer" containerID="a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c" Mar 16 00:17:24 crc kubenswrapper[4983]: I0316 00:17:24.692160 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18"} Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.137787 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:18:00 crc kubenswrapper[4983]: E0316 00:18:00.138713 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56bb064-30c4-4aaf-a4d2-c81006425b62" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.138737 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56bb064-30c4-4aaf-a4d2-c81006425b62" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.138956 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56bb064-30c4-4aaf-a4d2-c81006425b62" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.139556 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.142135 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.143045 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.143952 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.144098 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.271451 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") pod \"auto-csr-approver-29560338-2jkpl\" (UID: \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\") " pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.372471 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") pod \"auto-csr-approver-29560338-2jkpl\" (UID: \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\") " pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.403968 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") pod \"auto-csr-approver-29560338-2jkpl\" (UID: \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\") " pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.468842 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.690153 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.911828 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" event={"ID":"1c6e333f-fadd-4c92-8db1-b9a923850fa0","Type":"ContainerStarted","Data":"a7cd1304998263bded23ae2ca04dc03222cab0d730e80dcbefaa82e2d971d2df"} Mar 16 00:18:02 crc kubenswrapper[4983]: I0316 00:18:02.930442 4983 generic.go:334] "Generic (PLEG): container finished" podID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" containerID="e40b8ff2ea2fe096fb51ca5ef76f5eab03f687249bde3326f40974dcfd1c4938" exitCode=0 Mar 16 00:18:02 crc kubenswrapper[4983]: I0316 00:18:02.930532 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" event={"ID":"1c6e333f-fadd-4c92-8db1-b9a923850fa0","Type":"ContainerDied","Data":"e40b8ff2ea2fe096fb51ca5ef76f5eab03f687249bde3326f40974dcfd1c4938"} Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.116589 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.227786 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") pod \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\" (UID: \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\") " Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.240321 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld" (OuterVolumeSpecName: "kube-api-access-jdwld") pod "1c6e333f-fadd-4c92-8db1-b9a923850fa0" (UID: "1c6e333f-fadd-4c92-8db1-b9a923850fa0"). InnerVolumeSpecName "kube-api-access-jdwld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.329656 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") on node \"crc\" DevicePath \"\"" Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.942688 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" event={"ID":"1c6e333f-fadd-4c92-8db1-b9a923850fa0","Type":"ContainerDied","Data":"a7cd1304998263bded23ae2ca04dc03222cab0d730e80dcbefaa82e2d971d2df"} Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.942744 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7cd1304998263bded23ae2ca04dc03222cab0d730e80dcbefaa82e2d971d2df" Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.942879 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:05 crc kubenswrapper[4983]: I0316 00:18:05.175001 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"] Mar 16 00:18:05 crc kubenswrapper[4983]: I0316 00:18:05.182531 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"] Mar 16 00:18:05 crc kubenswrapper[4983]: I0316 00:18:05.501458 4983 scope.go:117] "RemoveContainer" containerID="76d2b798a64d4809150e865ba49cceb6346042cb22c2796d78469f6cd57fde6c" Mar 16 00:18:05 crc kubenswrapper[4983]: I0316 00:18:05.530970 4983 scope.go:117] "RemoveContainer" containerID="f1d9cd29662f3f229511dac637df41ff7b782921910c342dbfa3015d6466b383" Mar 16 00:18:06 crc kubenswrapper[4983]: I0316 00:18:06.104802 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d9cc10-08aa-485e-a7cd-305a3e316c39" path="/var/lib/kubelet/pods/90d9cc10-08aa-485e-a7cd-305a3e316c39/volumes" Mar 16 00:19:05 crc kubenswrapper[4983]: I0316 00:19:05.824364 4983 scope.go:117] "RemoveContainer" containerID="0e3f6e1e6221d6bd922f567a1feb21e97e8062170d3d8a1f33f38076de2dd3b8" Mar 16 00:19:05 crc kubenswrapper[4983]: I0316 00:19:05.903158 4983 scope.go:117] "RemoveContainer" containerID="81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae" Mar 16 00:19:23 crc kubenswrapper[4983]: I0316 00:19:23.447948 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:19:23 crc kubenswrapper[4983]: I0316 00:19:23.450061 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.442866 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wsfb4"] Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444220 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-controller" containerID="cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444308 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444308 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-node" containerID="cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444600 4983 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="northd" containerID="cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444618 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" containerID="cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444390 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-acl-logging" containerID="cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.445111 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" containerID="cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.482040 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.482040 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.486145 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.488033 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.488111 4983 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.489940 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.492936 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" containerID="cri-o://7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.493030 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.493117 4983 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.730593 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.732696 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovn-acl-logging/0.log" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.733171 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovn-controller/0.log" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.733573 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785646 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b29wv"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785889 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-node" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785909 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-node" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785924 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785932 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785943 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785951 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785962 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" containerName="oc" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785970 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" containerName="oc" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785982 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785990 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785999 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786006 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786015 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786021 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786033 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786040 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786051 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" 
containerName="kubecfg-setup" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786058 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kubecfg-setup" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786070 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786078 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786090 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="northd" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786098 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="northd" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786109 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-acl-logging" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786116 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-acl-logging" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786223 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786235 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786247 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786255 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786265 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="northd" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786274 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786283 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786291 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786299 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-node" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786315 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-acl-logging" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786324 4983 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" containerName="oc" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786434 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786444 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786455 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786463 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786573 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786586 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.789413 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.884923 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.884970 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.884994 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885013 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885037 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885062 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") pod 
\"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885088 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885110 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885116 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885141 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885181 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885186 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885217 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket" (OuterVolumeSpecName: "log-socket") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885216 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log" (OuterVolumeSpecName: "node-log") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885247 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885237 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885289 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885259 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885313 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885227 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885266 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885337 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885362 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885375 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885376 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885385 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885405 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash" (OuterVolumeSpecName: "host-slash") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885452 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885489 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885633 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885681 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885697 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885714 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885876 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885993 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886007 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886158 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-var-lib-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886186 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-kubelet\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886203 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-systemd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886250 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9hd\" (UniqueName: \"kubernetes.io/projected/728b696b-af39-40e1-9f49-eb3f9ab1f87d-kube-api-access-nm9hd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886268 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-systemd-units\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886285 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886296 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886349 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-log-socket\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886366 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-env-overrides\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886391 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886413 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-bin\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886438 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-node-log\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886476 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-script-lib\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886551 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-etc-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886589 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-netns\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886718 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-ovn\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886829 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-netd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886933 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-slash\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886984 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovn-node-metrics-cert\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887082 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-config\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887133 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887277 4983 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887307 4983 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887327 4983 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887346 4983 reconciler_common.go:293] "Volume detached for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887363 4983 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887379 4983 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887397 4983 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887415 4983 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887437 4983 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887459 4983 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887477 4983 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887492 4983 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887509 4983 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887525 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887541 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887561 4983 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887577 4983 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.891241 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.891339 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k" (OuterVolumeSpecName: "kube-api-access-88s5k") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "kube-api-access-88s5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.898262 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988587 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-etc-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988641 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-netns\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988675 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-ovn\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988701 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-netd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988730 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-slash\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988767 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovn-node-metrics-cert\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988795 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-config\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988815 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988851 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-var-lib-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988867 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-ovn\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988877 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-kubelet\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988867 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-slash\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988899 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-systemd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988908 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-var-lib-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988930 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-systemd\") pod \"ovnkube-node-b29wv\" (UID: 
\"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988939 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9hd\" (UniqueName: \"kubernetes.io/projected/728b696b-af39-40e1-9f49-eb3f9ab1f87d-kube-api-access-nm9hd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988813 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-etc-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989001 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-netd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989010 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-systemd-units\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988898 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988830 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-netns\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989077 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-systemd-units\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989079 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989175 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-log-socket\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc 
kubenswrapper[4983]: I0316 00:19:33.988960 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-kubelet\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989113 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989194 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-env-overrides\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989251 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-log-socket\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989362 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989392 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-bin\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989451 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989455 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-bin\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989494 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-node-log\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989523 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-node-log\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989546 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-script-lib\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989943 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-env-overrides\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.990278 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-script-lib\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989637 4983 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.990334 4983 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.990347 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.990368 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-config\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.993706 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovn-node-metrics-cert\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.009462 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9hd\" (UniqueName: \"kubernetes.io/projected/728b696b-af39-40e1-9f49-eb3f9ab1f87d-kube-api-access-nm9hd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.102468 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.490740 4983 generic.go:334] "Generic (PLEG): container finished" podID="728b696b-af39-40e1-9f49-eb3f9ab1f87d" containerID="23bf7874921506f9febc4cf6cbd0f358df99a2c8a12ee98a60f0365637a382da" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.490817 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerDied","Data":"23bf7874921506f9febc4cf6cbd0f358df99a2c8a12ee98a60f0365637a382da"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.491109 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"a1c8c5159e9520ea4c6fa20ab275e7ae7cb25edb60436880e7ad5d9a31900897"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.493227 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.495581 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovn-acl-logging/0.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496098 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovn-controller/0.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496554 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496588 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496607 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496621 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496625 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496631 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496640 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" 
containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496650 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496654 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496640 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496703 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496719 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496658 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" exitCode=143 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496731 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496741 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" exitCode=143 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496743 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496790 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496667 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496798 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496923 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496940 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496948 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496960 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496967 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496974 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496994 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497018 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497028 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497035 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497041 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497047 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497053 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497060 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497067 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497074 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497082 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497092 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497103 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497112 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497120 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497127 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497134 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497140 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497147 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497153 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497160 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497167 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497177 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497189 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497198 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497205 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497212 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497218 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497225 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497232 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497238 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497245 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497253 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498421 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/2.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498781 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498812 4983 generic.go:334] "Generic (PLEG): container finished" podID="f81ec143-6c51-4f96-ae71-a4759bac7c70" containerID="1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da" exitCode=2 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498835 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerDied","Data":"1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498851 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.499157 4983 scope.go:117] "RemoveContainer" containerID="1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.499334 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tqncp_openshift-multus(f81ec143-6c51-4f96-ae71-a4759bac7c70)\"" pod="openshift-multus/multus-tqncp" podUID="f81ec143-6c51-4f96-ae71-a4759bac7c70" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.524653 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.540387 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wsfb4"] Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.544072 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wsfb4"] Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.563061 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.607813 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.621710 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.636092 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.649375 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.661348 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.710562 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.726519 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.742257 4983 scope.go:117] "RemoveContainer" 
containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.742852 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.742897 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.742929 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.743350 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.743404 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} err="failed to get container status \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.743442 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.744047 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.744405 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} err="failed to get container status \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 
15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.744422 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.744835 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.744875 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} err="failed to get container status \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.744896 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.745490 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.745520 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} err="failed to get container status \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.745537 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.745836 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.745863 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} err="failed to get container status \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": rpc 
error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.745879 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.746176 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.746207 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} err="failed to get container status \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.746222 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.746625 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.746654 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} err="failed to get container status \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.746676 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.747089 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747127 4983 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} err="failed to get container status \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747146 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.747509 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747532 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} err="failed to get container status \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747546 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747949 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747967 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.748243 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} err="failed to get container status \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.748257 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.748503 4983 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} err="failed to get container status \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.748521 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.749472 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} err="failed to get container status \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.749492 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.749911 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} err="failed to get container status \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.749936 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750271 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} err="failed to get container status \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750292 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750546 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} err="failed to get container status \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" Mar 
16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750565 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750805 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} err="failed to get container status \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750828 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751088 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} err="failed to get container status \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751108 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751420 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} err="failed to get container status \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751438 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751669 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751686 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752190 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} err="failed to get container status 
\"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752213 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752660 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} err="failed to get container status \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752681 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752915 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} err="failed to get container status \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753113 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753380 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} err="failed to get container status \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753400 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753646 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} err="failed to get container status \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753666 4983 scope.go:117] "RemoveContainer" 
containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753935 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} err="failed to get container status \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753986 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.754384 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} err="failed to get container status \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.754407 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.754695 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} err="failed to get container status \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.754897 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.755257 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} err="failed to get container status \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.755277 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.755612 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find 
container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.755643 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.756108 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} err="failed to get container status \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.756127 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.756578 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} err="failed to get container status \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.756606 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757072 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} err="failed to get container status \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757096 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757327 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} err="failed to get container status \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757345 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757568 4983 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} err="failed to get container status \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757590 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757842 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} err="failed to get container status \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757868 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758084 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} err="failed to get container status \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758103 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758307 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} err="failed to get container status \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758324 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758595 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} err="failed to get container status \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 
294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758625 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758920 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514076 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"cafac4abaa5ea78029d0f81189400f2c7e33e0a3af5cf98c710f7c4b17f2726e"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514142 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"85f6ac2df96f9517579723f4b30522cd4829b387dce48dfe326c701bf2c145b8"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514239 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"aafbb1172845e63d001271f1bc0b1c8aa3f051bdc2764bf44a65e63399919e17"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514307 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"0d3ab15df6e440889eed1141da42ab71266ef7b836c68ee8422d06493316f458"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514334 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"89005a22df996694d53a88a98c1e3d6e8aec458923c8e8e2bfa6b91a1a70bd39"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514353 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"281e191e6c680a52386c56c840df4b1e717fe00d47ab0983bb24eb4c2092330c"} Mar 16 00:19:36 crc kubenswrapper[4983]: I0316 00:19:36.105805 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" path="/var/lib/kubelet/pods/f055dad5-7c9b-46a1-a715-34847c30d0cf/volumes" Mar 16 00:19:37 crc kubenswrapper[4983]: I0316 00:19:37.531369 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"c58a78bdbc722c72145803dc2cbd3b9ef82cbca5dbce9a3d2b91a594548c0d95"} Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.559153 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" 
event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"8c353e4739fa5108c73a78b99d1acc8c55f379b1c1bfab213c85b7962a208d16"} Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.561707 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.561726 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.589617 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.599481 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" podStartSLOduration=7.59944868 podStartE2EDuration="7.59944868s" podCreationTimestamp="2026-03-16 00:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:19:40.588706219 +0000 UTC m=+789.188804689" watchObservedRunningTime="2026-03-16 00:19:40.59944868 +0000 UTC m=+789.199547150" Mar 16 00:19:41 crc kubenswrapper[4983]: I0316 00:19:41.566806 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:41 crc kubenswrapper[4983]: I0316 00:19:41.630416 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:47 crc kubenswrapper[4983]: I0316 00:19:47.092704 4983 scope.go:117] "RemoveContainer" containerID="1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da" Mar 16 00:19:47 crc kubenswrapper[4983]: I0316 00:19:47.607524 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/2.log" Mar 16 00:19:47 crc kubenswrapper[4983]: I0316 00:19:47.608160 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:19:47 crc kubenswrapper[4983]: I0316 00:19:47.608239 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"d802bbc6780abcefaa729b7a4287774d5b04d7688b5a98ef4e274499eb75f8ea"} Mar 16 00:19:53 crc kubenswrapper[4983]: I0316 00:19:53.448938 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:19:53 crc kubenswrapper[4983]: I0316 00:19:53.449553 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.220150 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.221306 4983 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.221407 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.226872 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.227031 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.226877 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.327378 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") pod \"auto-csr-approver-29560340-664mq\" (UID: \"3356aa9a-4f16-4602-97b0-1118f7e55776\") " pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.429345 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") pod \"auto-csr-approver-29560340-664mq\" (UID: \"3356aa9a-4f16-4602-97b0-1118f7e55776\") " pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.454352 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") pod \"auto-csr-approver-29560340-664mq\" (UID: \"3356aa9a-4f16-4602-97b0-1118f7e55776\") " pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.540535 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.754990 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:20:00 crc kubenswrapper[4983]: W0316 00:20:00.766090 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3356aa9a_4f16_4602_97b0_1118f7e55776.slice/crio-fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720 WatchSource:0}: Error finding container fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720: Status 404 returned error can't find the container with id fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720 Mar 16 00:20:01 crc kubenswrapper[4983]: I0316 00:20:01.709323 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-664mq" event={"ID":"3356aa9a-4f16-4602-97b0-1118f7e55776","Type":"ContainerStarted","Data":"fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720"} Mar 16 00:20:03 crc kubenswrapper[4983]: I0316 00:20:03.721350 4983 generic.go:334] "Generic (PLEG): container finished" podID="3356aa9a-4f16-4602-97b0-1118f7e55776" containerID="ec962f764e58dc18fb35bd2bf73250ec727cbdfcfdec0a585462238f6e2032c9" exitCode=0 Mar 16 00:20:03 crc kubenswrapper[4983]: I0316 00:20:03.721388 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-664mq" event={"ID":"3356aa9a-4f16-4602-97b0-1118f7e55776","Type":"ContainerDied","Data":"ec962f764e58dc18fb35bd2bf73250ec727cbdfcfdec0a585462238f6e2032c9"} Mar 16 00:20:04 crc kubenswrapper[4983]: I0316 00:20:04.123311 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.022323 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.091155 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") pod \"3356aa9a-4f16-4602-97b0-1118f7e55776\" (UID: \"3356aa9a-4f16-4602-97b0-1118f7e55776\") " Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.098643 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq" (OuterVolumeSpecName: "kube-api-access-79clq") pod "3356aa9a-4f16-4602-97b0-1118f7e55776" (UID: "3356aa9a-4f16-4602-97b0-1118f7e55776"). InnerVolumeSpecName "kube-api-access-79clq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.192210 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.737259 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-664mq" event={"ID":"3356aa9a-4f16-4602-97b0-1118f7e55776","Type":"ContainerDied","Data":"fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720"} Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.737588 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.737467 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.946707 4983 scope.go:117] "RemoveContainer" containerID="dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9" Mar 16 00:20:06 crc kubenswrapper[4983]: I0316 00:20:06.069256 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:20:06 crc kubenswrapper[4983]: I0316 00:20:06.072406 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:20:06 crc kubenswrapper[4983]: I0316 00:20:06.100188 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272489bc-7bd4-4421-930d-150816da83b8" path="/var/lib/kubelet/pods/272489bc-7bd4-4421-930d-150816da83b8/volumes" Mar 16 00:20:06 crc kubenswrapper[4983]: I0316 00:20:06.744829 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/2.log" Mar 16 00:20:10 crc kubenswrapper[4983]: I0316 00:20:10.969248 4983 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.432105 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.434170 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hgt5w" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" containerID="cri-o://f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" gracePeriod=30 Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.512353 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 is running failed: container process not found" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.512836 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 is running failed: 
container process not found" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.513266 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 is running failed: container process not found" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.513309 4983 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-hgt5w" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.776712 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819528 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" exitCode=0 Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819575 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerDied","Data":"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010"} Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819605 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerDied","Data":"3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541"} Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819623 4983 scope.go:117] "RemoveContainer" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819741 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.845066 4983 scope.go:117] "RemoveContainer" containerID="361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.863025 4983 scope.go:117] "RemoveContainer" containerID="0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.880966 4983 scope.go:117] "RemoveContainer" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.881464 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010\": container with ID starting with f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 not found: ID does not exist" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.881498 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010"} err="failed to get container status \"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010\": rpc error: code = NotFound desc = could not find container \"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010\": container with ID starting with f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 not found: ID does not exist" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.881518 4983 scope.go:117] "RemoveContainer" containerID="361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d" Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.882340 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d\": container with ID starting with 361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d not found: ID does not exist" containerID="361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.882396 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d"} err="failed to get container status \"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d\": rpc error: code = NotFound desc = could not find container \"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d\": container with ID starting with 361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d not found: ID does not exist" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.882432 4983 scope.go:117] "RemoveContainer" containerID="0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50" Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.882837 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50\": container with ID starting with 0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50 not found: ID does not exist" containerID="0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50" 
Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.882874 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50"} err="failed to get container status \"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50\": rpc error: code = NotFound desc = could not find container \"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50\": container with ID starting with 0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50 not found: ID does not exist" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.956983 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") pod \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.957066 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") pod \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.957105 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") pod \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.958247 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities" (OuterVolumeSpecName: "utilities") pod "b4cf6a4e-082d-473f-8640-b1eb9b6591d2" (UID: "b4cf6a4e-082d-473f-8640-b1eb9b6591d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.961768 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8" (OuterVolumeSpecName: "kube-api-access-2snn8") pod "b4cf6a4e-082d-473f-8640-b1eb9b6591d2" (UID: "b4cf6a4e-082d-473f-8640-b1eb9b6591d2"). InnerVolumeSpecName "kube-api-access-2snn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.989267 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4cf6a4e-082d-473f-8640-b1eb9b6591d2" (UID: "b4cf6a4e-082d-473f-8640-b1eb9b6591d2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.058989 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.059055 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.059087 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.158246 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.163802 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:20:22 crc kubenswrapper[4983]: I0316 00:20:22.100934 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" path="/var/lib/kubelet/pods/b4cf6a4e-082d-473f-8640-b1eb9b6591d2/volumes" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331252 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7"] Mar 16 00:20:23 crc kubenswrapper[4983]: E0316 00:20:23.331443 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331454 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" Mar 16 00:20:23 crc kubenswrapper[4983]: E0316 00:20:23.331468 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="extract-content" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331474 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="extract-content" Mar 16 00:20:23 crc kubenswrapper[4983]: E0316 00:20:23.331489 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3356aa9a-4f16-4602-97b0-1118f7e55776" containerName="oc" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331495 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="3356aa9a-4f16-4602-97b0-1118f7e55776" containerName="oc" Mar 16 00:20:23 crc kubenswrapper[4983]: E0316 00:20:23.331504 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="extract-utilities" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331510 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="extract-utilities" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331601 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331611 4983 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="3356aa9a-4f16-4602-97b0-1118f7e55776" containerName="oc" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.332255 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.334428 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.341361 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7"] Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.414642 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.414708 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.414959 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.447824 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.447887 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.447935 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.448528 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:20:23 crc 
kubenswrapper[4983]: I0316 00:20:23.448596 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18" gracePeriod=600 Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.515519 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.515584 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.515613 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.516196 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.516534 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.539713 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.660144 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.846196 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18" exitCode=0 Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.846282 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18"} Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.846554 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5"} Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.846577 4983 scope.go:117] "RemoveContainer" containerID="e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.853637 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7"] Mar 16 00:20:23 crc kubenswrapper[4983]: W0316 00:20:23.864226 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4e5d5e8_e64e_4876_a604_976485b93449.slice/crio-be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58 WatchSource:0}: Error finding container be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58: Status 404 returned error can't find the container with id be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58 Mar 16 00:20:24 crc kubenswrapper[4983]: I0316 00:20:24.855187 4983 generic.go:334] "Generic (PLEG): container finished" podID="d4e5d5e8-e64e-4876-a604-976485b93449" containerID="4038284b7308d1921454212a0595d9066728744e6bdac74a76e70712f46efdca" exitCode=0 Mar 16 00:20:24 crc kubenswrapper[4983]: I0316 00:20:24.855298 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerDied","Data":"4038284b7308d1921454212a0595d9066728744e6bdac74a76e70712f46efdca"} Mar 16 00:20:24 crc kubenswrapper[4983]: I0316 00:20:24.855534 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerStarted","Data":"be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58"} Mar 16 00:20:25 crc kubenswrapper[4983]: I0316 00:20:25.873473 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerStarted","Data":"634138fc4f69897b591d293e0d6fada5b8c0f16866e672765d9c538b04bfc7af"} Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.494162 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.495464 4983 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.510030 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.554422 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.554737 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.554864 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.655446 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.655504 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.655543 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.656305 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.656418 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.673869 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.863278 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.879103 4983 generic.go:334] "Generic (PLEG): container finished" podID="d4e5d5e8-e64e-4876-a604-976485b93449" containerID="634138fc4f69897b591d293e0d6fada5b8c0f16866e672765d9c538b04bfc7af" exitCode=0 Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.879337 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerDied","Data":"634138fc4f69897b591d293e0d6fada5b8c0f16866e672765d9c538b04bfc7af"} Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.072350 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:27 crc kubenswrapper[4983]: W0316 00:20:27.087979 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fefe96_0a7d_4f0c_ad4c_9ddb1573f5eb.slice/crio-189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a WatchSource:0}: Error finding container 189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a: Status 404 returned error can't find the container with id 189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.885027 4983 generic.go:334] "Generic (PLEG): container finished" podID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerID="cf4b26abd512b4e809f852ba2adfc32b6dc9094135d78aae3f567c7db9c58b4d" exitCode=0 Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.885149 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerDied","Data":"cf4b26abd512b4e809f852ba2adfc32b6dc9094135d78aae3f567c7db9c58b4d"} Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.885451 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerStarted","Data":"189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a"} Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.888525 4983 generic.go:334] "Generic (PLEG): container finished" podID="d4e5d5e8-e64e-4876-a604-976485b93449" containerID="e46cd813755a717f51c264df7d3f3ee959849a5fecdfac5897bf8ae16155d088" exitCode=0 Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.888559 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerDied","Data":"e46cd813755a717f51c264df7d3f3ee959849a5fecdfac5897bf8ae16155d088"} Mar 16 00:20:28 crc kubenswrapper[4983]: I0316 00:20:28.896351 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" 
event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerStarted","Data":"7b01106210ef02c5fc4b2b479cfcc4510c0e3a3e5038d3f280ee033403a51f3c"} Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.167205 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.285385 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") pod \"d4e5d5e8-e64e-4876-a604-976485b93449\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.285486 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") pod \"d4e5d5e8-e64e-4876-a604-976485b93449\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.285532 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") pod \"d4e5d5e8-e64e-4876-a604-976485b93449\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.287967 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle" (OuterVolumeSpecName: "bundle") pod "d4e5d5e8-e64e-4876-a604-976485b93449" (UID: "d4e5d5e8-e64e-4876-a604-976485b93449"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.295992 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7" (OuterVolumeSpecName: "kube-api-access-tsbd7") pod "d4e5d5e8-e64e-4876-a604-976485b93449" (UID: "d4e5d5e8-e64e-4876-a604-976485b93449"). InnerVolumeSpecName "kube-api-access-tsbd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.309540 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util" (OuterVolumeSpecName: "util") pod "d4e5d5e8-e64e-4876-a604-976485b93449" (UID: "d4e5d5e8-e64e-4876-a604-976485b93449"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.386880 4983 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.386935 4983 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.386946 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.904947 4983 generic.go:334] "Generic (PLEG): container finished" podID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerID="7b01106210ef02c5fc4b2b479cfcc4510c0e3a3e5038d3f280ee033403a51f3c" exitCode=0 Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.905058 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerDied","Data":"7b01106210ef02c5fc4b2b479cfcc4510c0e3a3e5038d3f280ee033403a51f3c"} Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.910902 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerDied","Data":"be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58"} Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.910939 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.910970 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115436 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4"] Mar 16 00:20:30 crc kubenswrapper[4983]: E0316 00:20:30.115623 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="extract" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115635 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="extract" Mar 16 00:20:30 crc kubenswrapper[4983]: E0316 00:20:30.115655 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="util" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115661 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="util" Mar 16 00:20:30 crc kubenswrapper[4983]: E0316 00:20:30.115671 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="pull" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115677 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="pull" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115792 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="extract" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.116462 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.120336 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.146486 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4"] Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.298554 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.298704 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.298740 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") pod 
\"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.400031 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.400094 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.400179 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.400911 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.401032 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.423081 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.469435 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.652175 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4"] Mar 16 00:20:30 crc kubenswrapper[4983]: W0316 00:20:30.660895 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48256dd4_332f_4a25_a535_4357e3b8eccb.slice/crio-e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f WatchSource:0}: Error finding container e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f: Status 404 returned error can't find the container with id e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.923798 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x"] Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.925405 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.930163 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x"] Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.935476 4983 generic.go:334] "Generic (PLEG): container finished" podID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerID="37eb925f51cce8332a0d3416e69ecaa63d5963330725cae68d19889ffc78eb0d" exitCode=0 Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.935565 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerDied","Data":"37eb925f51cce8332a0d3416e69ecaa63d5963330725cae68d19889ffc78eb0d"} Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.935599 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerStarted","Data":"e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f"} Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.940460 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerStarted","Data":"3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7"} Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.971353 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-246kv" podStartSLOduration=2.538387461 podStartE2EDuration="4.971335593s" podCreationTimestamp="2026-03-16 00:20:26 +0000 UTC" firstStartedPulling="2026-03-16 00:20:27.88713968 +0000 UTC m=+836.487238110" lastFinishedPulling="2026-03-16 00:20:30.320087822 +0000 UTC m=+838.920186242" observedRunningTime="2026-03-16 00:20:30.967474302 +0000 UTC m=+839.567572732" watchObservedRunningTime="2026-03-16 00:20:30.971335593 +0000 UTC m=+839.571434023" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.008150 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.008283 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.008349 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.109587 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.109864 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.109994 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.110631 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.111014 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.132184 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.254067 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.464864 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x"] Mar 16 00:20:31 crc kubenswrapper[4983]: W0316 00:20:31.469602 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8092d7d9_1bb8_44ce_bad9_4f36ba75b349.slice/crio-45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00 WatchSource:0}: Error finding container 45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00: Status 404 returned error can't find the container with id 45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00 Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.946169 4983 generic.go:334] "Generic (PLEG): container finished" podID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerID="b2eedd04857c1bfe63cf7a147ece277e81ccf9d3fd9d5f5dedd1e310bf405781" exitCode=0 Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.946267 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerDied","Data":"b2eedd04857c1bfe63cf7a147ece277e81ccf9d3fd9d5f5dedd1e310bf405781"} Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.946503 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerStarted","Data":"45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00"} Mar 16 00:20:32 crc kubenswrapper[4983]: I0316 00:20:32.952913 4983 generic.go:334] "Generic (PLEG): container finished" podID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerID="c4fb851d36c9925cb4f5a450a8ba4c9f9f89ad6c5b9a79d22a1b0e42f3820a1c" exitCode=0 Mar 16 00:20:32 crc kubenswrapper[4983]: I0316 00:20:32.953025 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerDied","Data":"c4fb851d36c9925cb4f5a450a8ba4c9f9f89ad6c5b9a79d22a1b0e42f3820a1c"} Mar 16 00:20:32 crc kubenswrapper[4983]: I0316 00:20:32.954581 4983 generic.go:334] "Generic (PLEG): container finished" podID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerID="5da96aafe7ab4e3aff6e9a3e1302bf9ac0b47550440914c2e8ad72b3f672453e" exitCode=0 Mar 16 00:20:32 crc kubenswrapper[4983]: I0316 00:20:32.954606 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerDied","Data":"5da96aafe7ab4e3aff6e9a3e1302bf9ac0b47550440914c2e8ad72b3f672453e"} Mar 16 00:20:33 crc kubenswrapper[4983]: I0316 00:20:33.962040 4983 generic.go:334] "Generic (PLEG): container finished" podID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerID="87ae9077a4806cbec3b568db675d0545fefc2a16e8da481d47eb9d7b5ee01c52" exitCode=0 Mar 16 00:20:33 crc kubenswrapper[4983]: I0316 00:20:33.962117 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerDied","Data":"87ae9077a4806cbec3b568db675d0545fefc2a16e8da481d47eb9d7b5ee01c52"} Mar 16 00:20:33 crc kubenswrapper[4983]: I0316 00:20:33.964490 4983 generic.go:334] "Generic (PLEG): container finished" podID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerID="6eedef20352e4aad3a2355f26948e4452aa05f0087f89db2be1b74ea509461c1" exitCode=0 Mar 16 00:20:33 crc kubenswrapper[4983]: I0316 00:20:33.964515 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerDied","Data":"6eedef20352e4aad3a2355f26948e4452aa05f0087f89db2be1b74ea509461c1"} Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.410862 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.473549 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576384 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") pod \"48256dd4-332f-4a25-a535-4357e3b8eccb\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576437 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") pod \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576460 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") pod \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576494 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") pod \"48256dd4-332f-4a25-a535-4357e3b8eccb\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576513 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") pod \"48256dd4-332f-4a25-a535-4357e3b8eccb\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576548 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") pod \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576984 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle" (OuterVolumeSpecName: "bundle") pod "48256dd4-332f-4a25-a535-4357e3b8eccb" (UID: "48256dd4-332f-4a25-a535-4357e3b8eccb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.577336 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle" (OuterVolumeSpecName: "bundle") pod "8092d7d9-1bb8-44ce-bad9-4f36ba75b349" (UID: "8092d7d9-1bb8-44ce-bad9-4f36ba75b349"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.584959 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8" (OuterVolumeSpecName: "kube-api-access-l47j8") pod "8092d7d9-1bb8-44ce-bad9-4f36ba75b349" (UID: "8092d7d9-1bb8-44ce-bad9-4f36ba75b349"). InnerVolumeSpecName "kube-api-access-l47j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.595937 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5" (OuterVolumeSpecName: "kube-api-access-zz9w5") pod "48256dd4-332f-4a25-a535-4357e3b8eccb" (UID: "48256dd4-332f-4a25-a535-4357e3b8eccb"). InnerVolumeSpecName "kube-api-access-zz9w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.613133 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util" (OuterVolumeSpecName: "util") pod "8092d7d9-1bb8-44ce-bad9-4f36ba75b349" (UID: "8092d7d9-1bb8-44ce-bad9-4f36ba75b349"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.613841 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util" (OuterVolumeSpecName: "util") pod "48256dd4-332f-4a25-a535-4357e3b8eccb" (UID: "48256dd4-332f-4a25-a535-4357e3b8eccb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678073 4983 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678110 4983 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678119 4983 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678128 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678140 4983 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678150 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700444 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700650 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="util" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700662 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="util" Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700671 4983 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="extract" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700677 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="extract" Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700688 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="extract" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700694 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="extract" Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700709 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="util" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700715 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="util" Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700723 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="pull" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700728 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="pull" Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700738 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="pull" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700743 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="pull" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700838 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="extract" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700853 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="extract" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.701513 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.720511 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.779113 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.779188 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.779212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880084 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880145 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880172 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880676 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880721 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.904782 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.976087 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerDied","Data":"45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00"} Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.976128 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.976131 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.978296 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerDied","Data":"e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f"} Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.978333 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.978398 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.012519 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.331251 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.863849 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.863898 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.984284 4983 generic.go:334] "Generic (PLEG): container finished" podID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerID="fdae44ae256e839c75602b525bcc23b96273c95335b8e9ad6fa6615a4eb894ee" exitCode=0 Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.984383 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerDied","Data":"fdae44ae256e839c75602b525bcc23b96273c95335b8e9ad6fa6615a4eb894ee"} Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.984864 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerStarted","Data":"2ca390185601a7ed62fc960bc8af54221f40dfad67cd1fd3da407d672363b944"} Mar 16 00:20:37 crc kubenswrapper[4983]: I0316 00:20:37.921145 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-246kv" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server" probeResult="failure" output=< Mar 16 00:20:37 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s Mar 16 00:20:37 crc kubenswrapper[4983]: > Mar 16 00:20:37 crc kubenswrapper[4983]: I0316 00:20:37.991304 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerStarted","Data":"b8812e102e4a81f30e5ce1c0f485f8bff4418d56c5b0c04d3dec108534a29084"} Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.933043 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"] Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.933979 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.936551 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.940088 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.940156 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.940382 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.952851 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"] Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.998918 4983 generic.go:334] "Generic (PLEG): container finished" podID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerID="b8812e102e4a81f30e5ce1c0f485f8bff4418d56c5b0c04d3dec108534a29084" exitCode=0 Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.998966 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerDied","Data":"b8812e102e4a81f30e5ce1c0f485f8bff4418d56c5b0c04d3dec108534a29084"} Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042016 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042283 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042372 4983 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042640 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042744 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.064791 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.250648 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.497451 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"] Mar 16 00:20:39 crc kubenswrapper[4983]: W0316 00:20:39.511546 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd45ab45_645e_45d3_a9eb_a3d1392b5f7a.slice/crio-d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d WatchSource:0}: Error finding container d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d: Status 404 returned error can't find the container with id d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.006896 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerStarted","Data":"ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2"} Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.008624 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerStarted","Data":"5be1a7319c157e061cc029f579f1321a2ab9f725df9d59cd85cfdeb7f85614c8"} Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.008650 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerStarted","Data":"d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d"} Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.026053 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l677b" podStartSLOduration=2.557308675 podStartE2EDuration="5.026039425s" podCreationTimestamp="2026-03-16 00:20:35 +0000 UTC" firstStartedPulling="2026-03-16 00:20:36.985835305 +0000 UTC m=+845.585933735" lastFinishedPulling="2026-03-16 00:20:39.454566045 +0000 UTC m=+848.054664485" observedRunningTime="2026-03-16 00:20:40.02549264 +0000 UTC m=+848.625591070" watchObservedRunningTime="2026-03-16 00:20:40.026039425 +0000 UTC m=+848.626137845" Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.903892 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"] Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.905213 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.907193 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-qbx62" Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.907404 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.913908 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"] Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.915926 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.014880 4983 generic.go:334] "Generic (PLEG): container finished" podID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerID="5be1a7319c157e061cc029f579f1321a2ab9f725df9d59cd85cfdeb7f85614c8" exitCode=0 Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.015641 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerDied","Data":"5be1a7319c157e061cc029f579f1321a2ab9f725df9d59cd85cfdeb7f85614c8"} Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.021835 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.023167 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.025533 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-z2zf6" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.037047 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.037782 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.038219 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.052824 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.055082 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.077442 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj89v\" (UniqueName: \"kubernetes.io/projected/2af5ec54-bcc4-45f5-839a-135da91513a2-kube-api-access-rj89v\") pod \"obo-prometheus-operator-68bc856cb9-sn6x8\" (UID: \"2af5ec54-bcc4-45f5-839a-135da91513a2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.135892 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-c99mb"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.136570 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.138456 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.138894 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-slwg2" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.153629 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-c99mb"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.183940 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj89v\" (UniqueName: \"kubernetes.io/projected/2af5ec54-bcc4-45f5-839a-135da91513a2-kube-api-access-rj89v\") pod \"obo-prometheus-operator-68bc856cb9-sn6x8\" (UID: \"2af5ec54-bcc4-45f5-839a-135da91513a2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.184019 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.184109 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" Mar 16 00:20:41 crc kubenswrapper[4983]: 
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.184128 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.184151 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.208532 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj89v\" (UniqueName: \"kubernetes.io/projected/2af5ec54-bcc4-45f5-839a-135da91513a2-kube-api-access-rj89v\") pod \"obo-prometheus-operator-68bc856cb9-sn6x8\" (UID: \"2af5ec54-bcc4-45f5-839a-135da91513a2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.276786 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285628 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7drq\" (UniqueName: \"kubernetes.io/projected/05523d68-53d9-4cc5-a02b-5221a2396606-kube-api-access-x7drq\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285885 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285945 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285983 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/05523d68-53d9-4cc5-a02b-5221a2396606-observability-operator-tls\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.286072 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.290573 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.291311 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.291312 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.293926 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.340355 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.344241 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dmdpt"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.353236 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dmdpt"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.353353 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.355271 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.356895 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-qzjzq" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.387188 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/05523d68-53d9-4cc5-a02b-5221a2396606-observability-operator-tls\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.387262 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7drq\" (UniqueName: \"kubernetes.io/projected/05523d68-53d9-4cc5-a02b-5221a2396606-kube-api-access-x7drq\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.392667 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/05523d68-53d9-4cc5-a02b-5221a2396606-observability-operator-tls\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.417684 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7drq\" (UniqueName: \"kubernetes.io/projected/05523d68-53d9-4cc5-a02b-5221a2396606-kube-api-access-x7drq\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.453673 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.488470 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbpxn\" (UniqueName: \"kubernetes.io/projected/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-kube-api-access-zbpxn\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.488523 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.590258 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbpxn\" (UniqueName: \"kubernetes.io/projected/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-kube-api-access-zbpxn\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.590303 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.592655 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.623586 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbpxn\" (UniqueName: \"kubernetes.io/projected/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-kube-api-access-zbpxn\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.709889 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.718619 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.801110 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-c99mb"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.843671 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"] Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.852534 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"] Mar 16 00:20:41 crc kubenswrapper[4983]: W0316 00:20:41.855577 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e065fa9_405e_452b_bfe7_c4920a8577db.slice/crio-9277385c2fbc793cbe6eb1c23d7a7ec7cdae2a2c3bd1193732f8676c914e8201 WatchSource:0}: Error finding container 9277385c2fbc793cbe6eb1c23d7a7ec7cdae2a2c3bd1193732f8676c914e8201: Status 404 returned error can't find the container with id 9277385c2fbc793cbe6eb1c23d7a7ec7cdae2a2c3bd1193732f8676c914e8201 Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.940093 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dmdpt"] Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.024857 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" event={"ID":"2af5ec54-bcc4-45f5-839a-135da91513a2","Type":"ContainerStarted","Data":"027cd6429254dd17ee667c31575d9e723fe4b02dbf7690e108c0f0f84d5046b5"} Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.026345 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd" event={"ID":"7e065fa9-405e-452b-bfe7-c4920a8577db","Type":"ContainerStarted","Data":"9277385c2fbc793cbe6eb1c23d7a7ec7cdae2a2c3bd1193732f8676c914e8201"} Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.027272 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" event={"ID":"30d188b9-ab98-47a3-8143-3f58ae611dd6","Type":"ContainerStarted","Data":"1c7aa08faa1e0c4d71702c30fee2135c0edeb9c7749dedf3a525b4d9ba16acc0"} Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.028282 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" event={"ID":"8eb6b056-16ea-46db-b8ea-fd17a717a8e4","Type":"ContainerStarted","Data":"bd56ccc15d253d16cfb3130753611c862d5e0f5dc093b529a4a32320d273acdf"} Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.029150 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" event={"ID":"05523d68-53d9-4cc5-a02b-5221a2396606","Type":"ContainerStarted","Data":"855e71bf1bdda4a6f1e450402d524f07bf3fb40ee40240144e479978a5770db0"} Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.810941 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hdnm6"] Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.812012 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.815713 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.815968 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-2gs2c" Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.819542 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.821507 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hdnm6"] Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.946518 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfb7\" (UniqueName: \"kubernetes.io/projected/5f1c8286-7638-43ad-bfec-fe7210fa4d73-kube-api-access-tsfb7\") pod \"interconnect-operator-5bb49f789d-hdnm6\" (UID: \"5f1c8286-7638-43ad-bfec-fe7210fa4d73\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:45 crc kubenswrapper[4983]: I0316 00:20:45.049686 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfb7\" (UniqueName: \"kubernetes.io/projected/5f1c8286-7638-43ad-bfec-fe7210fa4d73-kube-api-access-tsfb7\") pod \"interconnect-operator-5bb49f789d-hdnm6\" (UID: \"5f1c8286-7638-43ad-bfec-fe7210fa4d73\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:45 crc kubenswrapper[4983]: I0316 00:20:45.099820 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfb7\" (UniqueName: \"kubernetes.io/projected/5f1c8286-7638-43ad-bfec-fe7210fa4d73-kube-api-access-tsfb7\") pod \"interconnect-operator-5bb49f789d-hdnm6\" (UID: \"5f1c8286-7638-43ad-bfec-fe7210fa4d73\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:45 crc kubenswrapper[4983]: I0316 00:20:45.141982 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.013041 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.018010 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.078934 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.209425 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.915624 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.966538 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.806963 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-b96d44b59-tbkm6"] Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.807619 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.809564 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.809591 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-bxd4d" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.829152 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-b96d44b59-tbkm6"] Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.889765 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-webhook-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.889819 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-apiservice-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.889863 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9sl\" (UniqueName: \"kubernetes.io/projected/7872b362-5118-4058-abba-048e0a81ecff-kube-api-access-vf9sl\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.991103 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-webhook-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.991165 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-apiservice-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.991212 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9sl\" (UniqueName: \"kubernetes.io/projected/7872b362-5118-4058-abba-048e0a81ecff-kube-api-access-vf9sl\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.997829 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-webhook-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.998404 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-apiservice-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:48 crc kubenswrapper[4983]: I0316 00:20:48.015688 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9sl\" (UniqueName: \"kubernetes.io/projected/7872b362-5118-4058-abba-048e0a81ecff-kube-api-access-vf9sl\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:48 crc kubenswrapper[4983]: I0316 00:20:48.136766 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:49 crc kubenswrapper[4983]: I0316 00:20:49.885389 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:49 crc kubenswrapper[4983]: I0316 00:20:49.887285 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l677b" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="registry-server" containerID="cri-o://ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2" gracePeriod=2 Mar 16 00:20:50 crc kubenswrapper[4983]: I0316 00:20:50.168711 4983 generic.go:334] "Generic (PLEG): container finished" podID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerID="ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2" exitCode=0 Mar 16 00:20:50 crc kubenswrapper[4983]: I0316 00:20:50.168782 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerDied","Data":"ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2"} Mar 16 00:20:51 crc kubenswrapper[4983]: I0316 00:20:51.084357 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:51 crc kubenswrapper[4983]: I0316 00:20:51.084638 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-246kv" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server" containerID="cri-o://3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7" gracePeriod=2 Mar 16 00:20:52 crc kubenswrapper[4983]: I0316 00:20:52.192273 4983 generic.go:334] "Generic (PLEG): container finished" podID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerID="3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7" exitCode=0 Mar 16 00:20:52 crc kubenswrapper[4983]: I0316 00:20:52.192338 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerDied","Data":"3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7"} Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.253590 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.268694 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hdnm6"] Mar 16 00:20:54 crc kubenswrapper[4983]: W0316 00:20:54.281929 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f1c8286_7638_43ad_bfec_fe7210fa4d73.slice/crio-fb954ad6b6c151e90996a0456f533d0f17575abf2506669743979dcbb92d5afd WatchSource:0}: Error finding container fb954ad6b6c151e90996a0456f533d0f17575abf2506669743979dcbb92d5afd: Status 404 returned error can't find the container with id fb954ad6b6c151e90996a0456f533d0f17575abf2506669743979dcbb92d5afd Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.288644 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390686 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") pod \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390765 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") pod \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390813 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") pod \"e93de2c7-8794-463c-9a2d-ac74246f35b7\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390869 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") pod \"e93de2c7-8794-463c-9a2d-ac74246f35b7\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390907 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") pod \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390922 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") pod \"e93de2c7-8794-463c-9a2d-ac74246f35b7\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.391472 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities" (OuterVolumeSpecName: "utilities") pod "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" (UID: "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.391595 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities" (OuterVolumeSpecName: "utilities") pod "e93de2c7-8794-463c-9a2d-ac74246f35b7" (UID: "e93de2c7-8794-463c-9a2d-ac74246f35b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.404991 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs" (OuterVolumeSpecName: "kube-api-access-ch4cs") pod "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" (UID: "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb"). InnerVolumeSpecName "kube-api-access-ch4cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.405947 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst" (OuterVolumeSpecName: "kube-api-access-xhpst") pod "e93de2c7-8794-463c-9a2d-ac74246f35b7" (UID: "e93de2c7-8794-463c-9a2d-ac74246f35b7"). InnerVolumeSpecName "kube-api-access-xhpst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.423172 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-b96d44b59-tbkm6"] Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.452354 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e93de2c7-8794-463c-9a2d-ac74246f35b7" (UID: "e93de2c7-8794-463c-9a2d-ac74246f35b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492641 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492677 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492689 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492701 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492712 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.525465 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" (UID: "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.593817 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.220480 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerDied","Data":"2ca390185601a7ed62fc960bc8af54221f40dfad67cd1fd3da407d672363b944"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.220814 4983 scope.go:117] "RemoveContainer" containerID="ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.220545 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.229660 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" event={"ID":"5f1c8286-7638-43ad-bfec-fe7210fa4d73","Type":"ContainerStarted","Data":"fb954ad6b6c151e90996a0456f533d0f17575abf2506669743979dcbb92d5afd"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.235061 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" event={"ID":"2af5ec54-bcc4-45f5-839a-135da91513a2","Type":"ContainerStarted","Data":"16f3a31594ed9eb1ebd9e260b07bcd275a1a8abfe9c17eedc8718739037a5e19"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.239741 4983 generic.go:334] "Generic (PLEG): container finished" podID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerID="5e7d0dbf615618baf160d51d6d79b4dfa32050f849de2fc0ecd095a7b4d3e2ba" exitCode=0 Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.239813 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerDied","Data":"5e7d0dbf615618baf160d51d6d79b4dfa32050f849de2fc0ecd095a7b4d3e2ba"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.243902 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" event={"ID":"8eb6b056-16ea-46db-b8ea-fd17a717a8e4","Type":"ContainerStarted","Data":"f88fc03862db31b62c9ab835ee03bb22c6aedc1454fb7f7afc015dbb8acf7663"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.244811 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.247910 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" event={"ID":"7872b362-5118-4058-abba-048e0a81ecff","Type":"ContainerStarted","Data":"94935f58532543c2baf38622cd3e455f27ee3fa634a67932b462375e9b856c00"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.248454 4983 scope.go:117] "RemoveContainer" containerID="b8812e102e4a81f30e5ce1c0f485f8bff4418d56c5b0c04d3dec108534a29084" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.251633 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" 
event={"ID":"05523d68-53d9-4cc5-a02b-5221a2396606","Type":"ContainerStarted","Data":"40839b5fc575fdad0097b730a2235961ba9936fd606461838603f07be3d289d7"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.252211 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.253907 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.256101 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd" event={"ID":"7e065fa9-405e-452b-bfe7-c4920a8577db","Type":"ContainerStarted","Data":"2239846f8ee0496822dcc8e3af5f7aa7292d1c49670430ca478852c50a72b302"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.261581 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" podStartSLOduration=3.247600189 podStartE2EDuration="15.261561379s" podCreationTimestamp="2026-03-16 00:20:40 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.872486144 +0000 UTC m=+850.472584574" lastFinishedPulling="2026-03-16 00:20:53.886447334 +0000 UTC m=+862.486545764" observedRunningTime="2026-03-16 00:20:55.249571974 +0000 UTC m=+863.849670404" watchObservedRunningTime="2026-03-16 00:20:55.261561379 +0000 UTC m=+863.861659809" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.264929 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" event={"ID":"30d188b9-ab98-47a3-8143-3f58ae611dd6","Type":"ContainerStarted","Data":"ab05ea1fcb6362c94ba5714366dcb07eb754c5a462062be9fab0d670bbb77ad4"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.276251 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerDied","Data":"189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.276351 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.292476 4983 scope.go:117] "RemoveContainer" containerID="fdae44ae256e839c75602b525bcc23b96273c95335b8e9ad6fa6615a4eb894ee" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.303699 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.326504 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.327100 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" podStartSLOduration=2.372363789 podStartE2EDuration="14.327088996s" podCreationTimestamp="2026-03-16 00:20:41 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.963699415 +0000 UTC m=+850.563797845" lastFinishedPulling="2026-03-16 00:20:53.918424622 +0000 UTC m=+862.518523052" observedRunningTime="2026-03-16 00:20:55.316219451 +0000 UTC m=+863.916317881" watchObservedRunningTime="2026-03-16 00:20:55.327088996 +0000 UTC m=+863.927187426" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.343595 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd" podStartSLOduration=2.318380974 podStartE2EDuration="14.343573348s" podCreationTimestamp="2026-03-16 00:20:41 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.873875051 +0000 UTC m=+850.473973481" lastFinishedPulling="2026-03-16 00:20:53.899067425 +0000 UTC m=+862.499165855" observedRunningTime="2026-03-16 00:20:55.338391483 +0000 UTC m=+863.938489923" watchObservedRunningTime="2026-03-16 00:20:55.343573348 +0000 UTC m=+863.943671778" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.359491 4983 scope.go:117] "RemoveContainer" containerID="3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.373896 4983 scope.go:117] "RemoveContainer" containerID="7b01106210ef02c5fc4b2b479cfcc4510c0e3a3e5038d3f280ee033403a51f3c" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.385492 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" podStartSLOduration=2.190860512 podStartE2EDuration="14.385474307s" podCreationTimestamp="2026-03-16 00:20:41 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.729681631 +0000 UTC m=+850.329780061" lastFinishedPulling="2026-03-16 00:20:53.924295426 +0000 UTC m=+862.524393856" observedRunningTime="2026-03-16 00:20:55.381859352 +0000 UTC m=+863.981957792" watchObservedRunningTime="2026-03-16 00:20:55.385474307 +0000 UTC m=+863.985572737" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.434189 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" podStartSLOduration=2.334999669 podStartE2EDuration="14.434168513s" podCreationTimestamp="2026-03-16 00:20:41 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.827402422 +0000 UTC m=+850.427500852" lastFinishedPulling="2026-03-16 00:20:53.926571266 +0000 UTC m=+862.526669696" observedRunningTime="2026-03-16 00:20:55.426371859 +0000 UTC m=+864.026470309" watchObservedRunningTime="2026-03-16 00:20:55.434168513 +0000 UTC 
m=+864.034266943" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.458185 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.466110 4983 scope.go:117] "RemoveContainer" containerID="cf4b26abd512b4e809f852ba2adfc32b6dc9094135d78aae3f567c7db9c58b4d" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.468784 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:56 crc kubenswrapper[4983]: I0316 00:20:56.101419 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" path="/var/lib/kubelet/pods/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb/volumes" Mar 16 00:20:56 crc kubenswrapper[4983]: I0316 00:20:56.102216 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" path="/var/lib/kubelet/pods/e93de2c7-8794-463c-9a2d-ac74246f35b7/volumes" Mar 16 00:20:56 crc kubenswrapper[4983]: I0316 00:20:56.286737 4983 generic.go:334] "Generic (PLEG): container finished" podID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerID="018414fffea206a4b55f418bf8e92cf5e98ded9ca1caa4406ee97e9a967f5a8e" exitCode=0 Mar 16 00:20:56 crc kubenswrapper[4983]: I0316 00:20:56.286791 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerDied","Data":"018414fffea206a4b55f418bf8e92cf5e98ded9ca1caa4406ee97e9a967f5a8e"} Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.118857 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.243592 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") pod \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.243645 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") pod \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.243711 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") pod \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.248152 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle" (OuterVolumeSpecName: "bundle") pod "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" (UID: "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.254259 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx" (OuterVolumeSpecName: "kube-api-access-696hx") pod "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" (UID: "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a"). InnerVolumeSpecName "kube-api-access-696hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.258859 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util" (OuterVolumeSpecName: "util") pod "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" (UID: "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.314155 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerDied","Data":"d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d"} Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.314457 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d" Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.314411 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.344719 4983 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.344766 4983 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.344780 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:59 crc kubenswrapper[4983]: I0316 00:20:59.322599 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" event={"ID":"7872b362-5118-4058-abba-048e0a81ecff","Type":"ContainerStarted","Data":"5c73e1ae7281bfb93b4a40e8701c75b7046f62a853b8f96c7a8ba4ab9015050c"} Mar 16 00:20:59 crc kubenswrapper[4983]: I0316 00:20:59.344396 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" podStartSLOduration=8.606621344 podStartE2EDuration="12.344376838s" podCreationTimestamp="2026-03-16 00:20:47 +0000 UTC" firstStartedPulling="2026-03-16 00:20:54.445050796 +0000 UTC m=+863.045149226" lastFinishedPulling="2026-03-16 00:20:58.18280629 +0000 UTC m=+866.782904720" observedRunningTime="2026-03-16 00:20:59.341051711 +0000 UTC m=+867.941150141" watchObservedRunningTime="2026-03-16 00:20:59.344376838 +0000 UTC m=+867.944475268" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.786741 
4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787163 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="pull" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787174 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="pull" Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787187 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="extract-utilities" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787195 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="extract-utilities" Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787205 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="extract-content" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787211 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="extract-content" Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787217 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="extract" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787223 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="extract" Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787229 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="registry-server" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787235 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="registry-server" Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787245 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="extract-utilities" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787250 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="extract-utilities" Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787259 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="util" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787265 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="util" Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787274 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="extract-content" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787279 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="extract-content" Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787285 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787291 4983 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787390 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787399 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="extract" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787410 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="registry-server" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.788105 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790240 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790240 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790877 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790898 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790906 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.791225 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.791851 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.791911 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-4rm6d" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.791855 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.815994 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.875855 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.875913 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.875990 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876032 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876049 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876134 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876242 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876298 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876328 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876358 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876404 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876435 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876463 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876529 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977485 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977550 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977588 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc 
kubenswrapper[4983]: I0316 00:21:00.977615 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977650 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977678 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977701 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977741 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977782 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977814 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977838 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977857 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977896 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977926 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977949 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.978508 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.978564 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.979402 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.983065 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.983232 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: 
\"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.984070 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.984492 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.996925 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.000590 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.002202 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.003009 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.003845 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.006268 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.009549 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.009907 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.107391 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.723181 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:21:04 crc kubenswrapper[4983]: I0316 00:21:04.193823 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:21:04 crc kubenswrapper[4983]: I0316 00:21:04.208476 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:21:04 crc kubenswrapper[4983]: I0316 00:21:04.356829 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerStarted","Data":"71ea766235047027981feea4dc6ee07b3bf940319a59262e0b075fa246950336"} Mar 16 00:21:04 crc kubenswrapper[4983]: I0316 00:21:04.358013 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" event={"ID":"5f1c8286-7638-43ad-bfec-fe7210fa4d73","Type":"ContainerStarted","Data":"d2920cd923d169ed87512c952802f3cd88209f9f6e10e2f0395529589d2811d5"} Mar 16 00:21:06 crc kubenswrapper[4983]: I0316 00:21:06.019055 4983 scope.go:117] "RemoveContainer" containerID="5fe0de833b2b27c1bfe835628ef9c6dca727580c2781fda123b15ad86663176a" Mar 16 00:21:12 crc kubenswrapper[4983]: I0316 00:21:12.143309 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" podStartSLOduration=18.421357421 podStartE2EDuration="28.143294463s" podCreationTimestamp="2026-03-16 00:20:44 +0000 UTC" firstStartedPulling="2026-03-16 00:20:54.301285318 +0000 UTC m=+862.901383748" lastFinishedPulling="2026-03-16 00:21:04.02322236 +0000 UTC m=+872.623320790" observedRunningTime="2026-03-16 00:21:04.372574716 +0000 UTC m=+872.972673146" watchObservedRunningTime="2026-03-16 00:21:12.143294463 +0000 UTC m=+880.743392893" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.154245 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"] Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.155380 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.156747 4983 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-k4mnx" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.157271 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.159803 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.175550 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"] Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.302581 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cdf343fd-5723-4e49-ad01-837bb0bbbed2-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.302658 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv5vt\" (UniqueName: \"kubernetes.io/projected/cdf343fd-5723-4e49-ad01-837bb0bbbed2-kube-api-access-fv5vt\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.404356 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cdf343fd-5723-4e49-ad01-837bb0bbbed2-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.404438 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv5vt\" (UniqueName: \"kubernetes.io/projected/cdf343fd-5723-4e49-ad01-837bb0bbbed2-kube-api-access-fv5vt\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.405199 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cdf343fd-5723-4e49-ad01-837bb0bbbed2-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.420931 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv5vt\" (UniqueName: \"kubernetes.io/projected/cdf343fd-5723-4e49-ad01-837bb0bbbed2-kube-api-access-fv5vt\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.480183 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:26 crc kubenswrapper[4983]: E0316 00:21:26.369597 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Mar 16 00:21:26 crc kubenswrapper[4983]: E0316 00:21:26.372543 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c /mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e): ErrImagePull: rpc error: code = Canceled desc = copying config: context 
canceled" logger="UnhandledError" Mar 16 00:21:26 crc kubenswrapper[4983]: E0316 00:21:26.373908 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" Mar 16 00:21:26 crc kubenswrapper[4983]: E0316 00:21:26.524606 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" Mar 16 00:21:26 crc kubenswrapper[4983]: I0316 00:21:26.671358 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:21:26 crc kubenswrapper[4983]: I0316 00:21:26.684586 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"] Mar 16 00:21:26 crc kubenswrapper[4983]: I0316 00:21:26.724543 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:21:27 crc kubenswrapper[4983]: I0316 00:21:27.529216 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" event={"ID":"cdf343fd-5723-4e49-ad01-837bb0bbbed2","Type":"ContainerStarted","Data":"88ba03d885765707fa54832ae7f4ef16e3e4330bf9b580ccf51c7eabc286c451"} Mar 16 00:21:27 crc kubenswrapper[4983]: E0316 00:21:27.530949 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" Mar 16 00:21:28 crc kubenswrapper[4983]: E0316 00:21:28.536527 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" Mar 16 00:21:30 crc kubenswrapper[4983]: I0316 00:21:30.547072 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" event={"ID":"cdf343fd-5723-4e49-ad01-837bb0bbbed2","Type":"ContainerStarted","Data":"302eaa980197387ded55e9ed71a52cd946778c6cecd618cf83a5d42430c84019"} Mar 16 00:21:30 crc kubenswrapper[4983]: I0316 00:21:30.565281 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" podStartSLOduration=11.84433926 podStartE2EDuration="14.56526047s" podCreationTimestamp="2026-03-16 00:21:16 +0000 UTC" firstStartedPulling="2026-03-16 00:21:26.694398407 +0000 UTC m=+895.294496837" lastFinishedPulling="2026-03-16 00:21:29.415319617 +0000 UTC m=+898.015418047" observedRunningTime="2026-03-16 00:21:30.561373668 +0000 UTC m=+899.161472108" watchObservedRunningTime="2026-03-16 00:21:30.56526047 +0000 UTC 
m=+899.165358900" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.384155 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-jbjkj"] Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.385438 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.387940 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.388352 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.389527 4983 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l52zx" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.395676 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-jbjkj"] Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.548426 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmw2\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-kube-api-access-9bmw2\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.548504 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.649877 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmw2\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-kube-api-access-9bmw2\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.649957 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.667229 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.669355 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmw2\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-kube-api-access-9bmw2\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") 
" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.699480 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.912656 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-jbjkj"] Mar 16 00:21:33 crc kubenswrapper[4983]: W0316 00:21:33.919859 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1bc27_146d_4df6_9e00_7e0cfb7f28ef.slice/crio-bf83c2b5f6c7eac36150a9a2baaabe47679fe50badd2e3d73792a039cfff3601 WatchSource:0}: Error finding container bf83c2b5f6c7eac36150a9a2baaabe47679fe50badd2e3d73792a039cfff3601: Status 404 returned error can't find the container with id bf83c2b5f6c7eac36150a9a2baaabe47679fe50badd2e3d73792a039cfff3601 Mar 16 00:21:34 crc kubenswrapper[4983]: I0316 00:21:34.569574 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" event={"ID":"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef","Type":"ContainerStarted","Data":"bf83c2b5f6c7eac36150a9a2baaabe47679fe50badd2e3d73792a039cfff3601"} Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.509460 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-g9j58"] Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.510291 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.512117 4983 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-87bvp" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.517311 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-g9j58"] Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.671965 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjcrw\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-kube-api-access-hjcrw\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.672036 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.773301 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.773412 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjcrw\" (UniqueName: 
\"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-kube-api-access-hjcrw\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.795386 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjcrw\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-kube-api-access-hjcrw\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.795394 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.871868 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:36 crc kubenswrapper[4983]: I0316 00:21:36.068735 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-g9j58"] Mar 16 00:21:36 crc kubenswrapper[4983]: W0316 00:21:36.078790 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c66c255_b5f4_4c72_8902_7225df93821d.slice/crio-5fb61081742f70061662414eaa7b162bbd1b672e48dcca7273647c0e58314b3a WatchSource:0}: Error finding container 5fb61081742f70061662414eaa7b162bbd1b672e48dcca7273647c0e58314b3a: Status 404 returned error can't find the container with id 5fb61081742f70061662414eaa7b162bbd1b672e48dcca7273647c0e58314b3a Mar 16 00:21:36 crc kubenswrapper[4983]: I0316 00:21:36.593298 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" event={"ID":"4c66c255-b5f4-4c72-8902-7225df93821d","Type":"ContainerStarted","Data":"5fb61081742f70061662414eaa7b162bbd1b672e48dcca7273647c0e58314b3a"} Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.899436 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.901560 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.903400 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.903907 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.904132 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.904224 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.921275 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.922708 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.922911 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.922953 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923002 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923038 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923068 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: 
\"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923101 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923127 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923154 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923208 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923432 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923458 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.023981 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024022 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024055 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024076 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024093 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024121 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024139 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024163 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024292 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024317 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024356 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024388 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024409 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024439 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024527 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024689 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024828 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024963 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.025434 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.025690 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.025718 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.031722 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.040203 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.067414 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.226366 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.533577 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:38 crc kubenswrapper[4983]: W0316 00:21:38.545107 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8dccbc9_1a91_4587_84d0_7e4171bb6632.slice/crio-a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1 WatchSource:0}: Error finding container a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1: Status 404 returned error can't find the container with id a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1 Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.606242 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" event={"ID":"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef","Type":"ContainerStarted","Data":"8b44564a30d00f9515a3f3d10097b85c3ac93f1f2f3362e3a7587e9f1cddc12a"} Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.606538 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.608138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" event={"ID":"4c66c255-b5f4-4c72-8902-7225df93821d","Type":"ContainerStarted","Data":"68447b504f8ad1a0f6b4f3a2aae0e78ea502ce8b7bfd4b19a53c2bc53c1805c2"} Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.609140 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerStarted","Data":"a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1"} Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.629601 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" podStartSLOduration=1.152113348 podStartE2EDuration="5.629577602s" podCreationTimestamp="2026-03-16 00:21:33 +0000 UTC" firstStartedPulling="2026-03-16 00:21:33.92251409 +0000 UTC m=+902.522612510" lastFinishedPulling="2026-03-16 00:21:38.399978334 +0000 UTC m=+907.000076764" observedRunningTime="2026-03-16 00:21:38.625341509 +0000 UTC m=+907.225439969" watchObservedRunningTime="2026-03-16 00:21:38.629577602 +0000 UTC m=+907.229676032" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.647569 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" podStartSLOduration=1.328880586 podStartE2EDuration="3.647549942s" podCreationTimestamp="2026-03-16 00:21:35 +0000 UTC" firstStartedPulling="2026-03-16 00:21:36.081671737 +0000 UTC m=+904.681770167" lastFinishedPulling="2026-03-16 00:21:38.400341093 +0000 UTC m=+907.000439523" observedRunningTime="2026-03-16 00:21:38.643576306 +0000 UTC m=+907.243674736" watchObservedRunningTime="2026-03-16 00:21:38.647549942 +0000 UTC m=+907.247648372" Mar 16 00:21:43 crc kubenswrapper[4983]: I0316 00:21:43.702405 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:44 crc kubenswrapper[4983]: I0316 00:21:44.646212 4983 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerStarted","Data":"0767b6d75d0337c8d943b36c8f8e475d03c2c0b56f536e438a9cbe93495c561e"} Mar 16 00:21:45 crc kubenswrapper[4983]: I0316 00:21:45.654178 4983 generic.go:334] "Generic (PLEG): container finished" podID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerID="85da0c85b3975564bbfc7df9911d364082da5edf592effa9165ac2fcdf52c986" exitCode=0 Mar 16 00:21:45 crc kubenswrapper[4983]: I0316 00:21:45.654292 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerDied","Data":"85da0c85b3975564bbfc7df9911d364082da5edf592effa9165ac2fcdf52c986"} Mar 16 00:21:46 crc kubenswrapper[4983]: I0316 00:21:46.663459 4983 generic.go:334] "Generic (PLEG): container finished" podID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" containerID="0767b6d75d0337c8d943b36c8f8e475d03c2c0b56f536e438a9cbe93495c561e" exitCode=0 Mar 16 00:21:46 crc kubenswrapper[4983]: I0316 00:21:46.663570 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerDied","Data":"0767b6d75d0337c8d943b36c8f8e475d03c2c0b56f536e438a9cbe93495c561e"} Mar 16 00:21:46 crc kubenswrapper[4983]: I0316 00:21:46.668689 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerStarted","Data":"f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd"} Mar 16 00:21:47 crc kubenswrapper[4983]: I0316 00:21:47.976685 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=5.157988149 podStartE2EDuration="10.976664807s" podCreationTimestamp="2026-03-16 00:21:37 +0000 UTC" firstStartedPulling="2026-03-16 00:21:38.547658216 +0000 UTC m=+907.147756636" lastFinishedPulling="2026-03-16 00:21:44.366334854 +0000 UTC m=+912.966433294" observedRunningTime="2026-03-16 00:21:46.719054661 +0000 UTC m=+915.319153091" watchObservedRunningTime="2026-03-16 00:21:47.976664807 +0000 UTC m=+916.576763237" Mar 16 00:21:47 crc kubenswrapper[4983]: I0316 00:21:47.980664 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:48 crc kubenswrapper[4983]: I0316 00:21:48.680837 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="docker-build" containerID="cri-o://f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd" gracePeriod=30 Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.660219 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.661848 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.664514 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.664604 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.664605 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.684325 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737319 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737362 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737490 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737549 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737565 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737592 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737626 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737645 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737664 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737689 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737704 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737846 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839221 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839284 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839310 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839369 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839399 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839420 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839438 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839456 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839474 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839489 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: 
\"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839528 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839553 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839675 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839925 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.840169 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.840367 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.840376 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.840700 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.841079 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.842082 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.847577 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.852709 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.865642 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.981569 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:50 crc kubenswrapper[4983]: I0316 00:21:50.845158 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:21:51 crc kubenswrapper[4983]: I0316 00:21:51.737286 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerStarted","Data":"b616705931879247088ce47bd9e865c198d5a42dd59830a5a6c9d692a05b1e4c"} Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.247396 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-cwdn9"] Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.248102 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.250552 4983 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xwfm5" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.256720 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-cwdn9"] Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.268028 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-bound-sa-token\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.268100 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wdnp\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-kube-api-access-5wdnp\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.369543 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-bound-sa-token\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.369726 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wdnp\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-kube-api-access-5wdnp\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.396479 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-bound-sa-token\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.398029 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wdnp\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-kube-api-access-5wdnp\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: E0316 00:21:52.503978 4983 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56cd8a8b_9f7b_45aa_ad0a_0c84fd70722e.slice/crio-dd5e6f0a89d0c30f1ed473612d847f5dd2a8615ae5534f0387176c8fa5f060d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56cd8a8b_9f7b_45aa_ad0a_0c84fd70722e.slice/crio-conmon-dd5e6f0a89d0c30f1ed473612d847f5dd2a8615ae5534f0387176c8fa5f060d5.scope\": RecentStats: unable to find data in memory cache]" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 
00:21:52.578252 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.743950 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_f8dccbc9-1a91-4587-84d0-7e4171bb6632/docker-build/0.log" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.744570 4983 generic.go:334] "Generic (PLEG): container finished" podID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerID="f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd" exitCode=1 Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.744653 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerDied","Data":"f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd"} Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.746160 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerStarted","Data":"eb890a8487aac34c05d790c7e0c2ffc24a8e23e29e057ce2dea6cae90f0436c3"} Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.748876 4983 generic.go:334] "Generic (PLEG): container finished" podID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" containerID="dd5e6f0a89d0c30f1ed473612d847f5dd2a8615ae5534f0387176c8fa5f060d5" exitCode=0 Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.748917 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerDied","Data":"dd5e6f0a89d0c30f1ed473612d847f5dd2a8615ae5534f0387176c8fa5f060d5"} Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.998806 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-cwdn9"] Mar 16 00:21:53 crc kubenswrapper[4983]: W0316 00:21:53.005946 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95208db3_d53d_43c0_9b2c_cc4c5b3236d8.slice/crio-3b5f2da13179be49dc4681871e7ceb1b9386a19105ead33b2c089d7e2c6b90a5 WatchSource:0}: Error finding container 3b5f2da13179be49dc4681871e7ceb1b9386a19105ead33b2c089d7e2c6b90a5: Status 404 returned error can't find the container with id 3b5f2da13179be49dc4681871e7ceb1b9386a19105ead33b2c089d7e2c6b90a5 Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.521904 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_f8dccbc9-1a91-4587-84d0-7e4171bb6632/docker-build/0.log" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.523398 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584273 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584314 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584345 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584405 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584457 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584481 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584500 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584516 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584537 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584577 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584621 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584647 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584909 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584963 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.585279 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.585351 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.585605 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.585892 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). 
InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.588658 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.589285 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.590236 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.590274 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.590259 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.593058 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w" (OuterVolumeSpecName: "kube-api-access-mkn9w") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "kube-api-access-mkn9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686028 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686060 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686071 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686081 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686089 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686097 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686105 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686114 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686123 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686131 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686138 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686148 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.756437 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-545d4d4674-cwdn9" event={"ID":"95208db3-d53d-43c0-9b2c-cc4c5b3236d8","Type":"ContainerStarted","Data":"d14743bab2e4da6b2538f7db284bd57dde768c5153fe3c6438dcfc742cb1f90c"} Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.756482 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-cwdn9" event={"ID":"95208db3-d53d-43c0-9b2c-cc4c5b3236d8","Type":"ContainerStarted","Data":"3b5f2da13179be49dc4681871e7ceb1b9386a19105ead33b2c089d7e2c6b90a5"} Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.759982 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerStarted","Data":"577fcf1cd2a5ecb4f5447e156045b4b6272fafffead51c5a949cf15995e185af"} Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.760214 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.761588 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_f8dccbc9-1a91-4587-84d0-7e4171bb6632/docker-build/0.log" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.762068 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerDied","Data":"a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1"} Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.762096 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.762126 4983 scope.go:117] "RemoveContainer" containerID="f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.775995 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-cwdn9" podStartSLOduration=1.775977581 podStartE2EDuration="1.775977581s" podCreationTimestamp="2026-03-16 00:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:21:53.773570757 +0000 UTC m=+922.373669197" watchObservedRunningTime="2026-03-16 00:21:53.775977581 +0000 UTC m=+922.376076011" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.785570 4983 scope.go:117] "RemoveContainer" containerID="85da0c85b3975564bbfc7df9911d364082da5edf592effa9165ac2fcdf52c986" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.817678 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=13.677571118 podStartE2EDuration="53.817663203s" podCreationTimestamp="2026-03-16 00:21:00 +0000 UTC" firstStartedPulling="2026-03-16 00:21:04.208243549 +0000 UTC m=+872.808341989" lastFinishedPulling="2026-03-16 00:21:44.348335644 +0000 UTC m=+912.948434074" observedRunningTime="2026-03-16 00:21:53.817363645 +0000 UTC m=+922.417462105" watchObservedRunningTime="2026-03-16 00:21:53.817663203 +0000 UTC m=+922.417761633" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.834730 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:53 crc 
kubenswrapper[4983]: I0316 00:21:53.842435 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:54 crc kubenswrapper[4983]: I0316 00:21:54.100736 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" path="/var/lib/kubelet/pods/f8dccbc9-1a91-4587-84d0-7e4171bb6632/volumes" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.140085 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:22:00 crc kubenswrapper[4983]: E0316 00:22:00.140748 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="docker-build" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.140775 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="docker-build" Mar 16 00:22:00 crc kubenswrapper[4983]: E0316 00:22:00.140791 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="manage-dockerfile" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.140797 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="manage-dockerfile" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.140903 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="docker-build" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.141301 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.145008 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.147851 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.148974 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.152832 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.257747 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") pod \"auto-csr-approver-29560342-544h5\" (UID: \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\") " pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.358949 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") pod \"auto-csr-approver-29560342-544h5\" (UID: \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\") " pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.382505 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkf4\" (UniqueName: 
\"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") pod \"auto-csr-approver-29560342-544h5\" (UID: \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\") " pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.457697 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.690273 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.800952 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-544h5" event={"ID":"d0d707a0-1b40-4364-9a61-cde76e2c80a1","Type":"ContainerStarted","Data":"03d9f251332a8028f445c59c6b0b7667a3bf4475e58aae573bb71b533169814c"} Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.802124 4983 generic.go:334] "Generic (PLEG): container finished" podID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerID="eb890a8487aac34c05d790c7e0c2ffc24a8e23e29e057ce2dea6cae90f0436c3" exitCode=0 Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.802155 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerDied","Data":"eb890a8487aac34c05d790c7e0c2ffc24a8e23e29e057ce2dea6cae90f0436c3"} Mar 16 00:22:01 crc kubenswrapper[4983]: I0316 00:22:01.811139 4983 generic.go:334] "Generic (PLEG): container finished" podID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerID="0bcb454de18d51c1deb1d605b1e69cd13d48c30efe37006944940aa52ae50544" exitCode=0 Mar 16 00:22:01 crc kubenswrapper[4983]: I0316 00:22:01.811254 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerDied","Data":"0bcb454de18d51c1deb1d605b1e69cd13d48c30efe37006944940aa52ae50544"} Mar 16 00:22:01 crc kubenswrapper[4983]: I0316 00:22:01.840509 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_4c20598e-7255-42e7-9ac2-e6e58a8e9c88/manage-dockerfile/0.log" Mar 16 00:22:02 crc kubenswrapper[4983]: I0316 00:22:02.822004 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerStarted","Data":"ad0129f0fbcf710c158fc5bf41eaefcab120c1afd0eb3d2ef815d5472ca56907"} Mar 16 00:22:02 crc kubenswrapper[4983]: I0316 00:22:02.824437 4983 generic.go:334] "Generic (PLEG): container finished" podID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" containerID="b6146a6dfae8df822feda4cd12d6f532571e6f993bc0ef397b108d23a0fa9361" exitCode=0 Mar 16 00:22:02 crc kubenswrapper[4983]: I0316 00:22:02.824467 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-544h5" event={"ID":"d0d707a0-1b40-4364-9a61-cde76e2c80a1","Type":"ContainerDied","Data":"b6146a6dfae8df822feda4cd12d6f532571e6f993bc0ef397b108d23a0fa9361"} Mar 16 00:22:02 crc kubenswrapper[4983]: I0316 00:22:02.930410 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=13.930389417 podStartE2EDuration="13.930389417s" podCreationTimestamp="2026-03-16 00:21:49 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:22:02.891652774 +0000 UTC m=+931.491751204" watchObservedRunningTime="2026-03-16 00:22:02.930389417 +0000 UTC m=+931.530487857" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.081291 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.213306 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") pod \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\" (UID: \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\") " Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.221085 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4" (OuterVolumeSpecName: "kube-api-access-gqkf4") pod "d0d707a0-1b40-4364-9a61-cde76e2c80a1" (UID: "d0d707a0-1b40-4364-9a61-cde76e2c80a1"). InnerVolumeSpecName "kube-api-access-gqkf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.315007 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.835688 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-544h5" event={"ID":"d0d707a0-1b40-4364-9a61-cde76e2c80a1","Type":"ContainerDied","Data":"03d9f251332a8028f445c59c6b0b7667a3bf4475e58aae573bb71b533169814c"} Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.835725 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03d9f251332a8028f445c59c6b0b7667a3bf4475e58aae573bb71b533169814c" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.835814 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:05 crc kubenswrapper[4983]: I0316 00:22:05.143877 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:22:05 crc kubenswrapper[4983]: I0316 00:22:05.147832 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:22:06 crc kubenswrapper[4983]: I0316 00:22:06.100422 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b56bb064-30c4-4aaf-a4d2-c81006425b62" path="/var/lib/kubelet/pods/b56bb064-30c4-4aaf-a4d2-c81006425b62/volumes" Mar 16 00:22:06 crc kubenswrapper[4983]: I0316 00:22:06.228656 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" containerName="elasticsearch" probeResult="failure" output=< Mar 16 00:22:06 crc kubenswrapper[4983]: {"timestamp": "2026-03-16T00:22:06+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 16 00:22:06 crc kubenswrapper[4983]: > Mar 16 00:22:06 crc kubenswrapper[4983]: I0316 00:22:06.417315 4983 scope.go:117] "RemoveContainer" containerID="a092715a78836d6cc7d08c15d4c8579198cd91313410de0ab11035815df03f19" Mar 16 00:22:11 crc kubenswrapper[4983]: I0316 00:22:11.359537 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:23 crc kubenswrapper[4983]: I0316 00:22:23.447858 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:22:23 crc kubenswrapper[4983]: I0316 00:22:23.448296 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:22:53 crc kubenswrapper[4983]: I0316 00:22:53.448849 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:22:53 crc kubenswrapper[4983]: I0316 00:22:53.449404 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.936228 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:11 crc kubenswrapper[4983]: E0316 00:23:11.937153 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" containerName="oc" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.937174 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" containerName="oc" Mar 16 00:23:11 
crc kubenswrapper[4983]: I0316 00:23:11.937689 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" containerName="oc" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.940507 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.954981 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.104773 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.104901 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.104979 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.206339 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.206403 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.206459 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.207218 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.207398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.238620 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.272741 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.587872 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:13 crc kubenswrapper[4983]: I0316 00:23:13.287803 4983 generic.go:334] "Generic (PLEG): container finished" podID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerID="21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2" exitCode=0 Mar 16 00:23:13 crc kubenswrapper[4983]: I0316 00:23:13.287883 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerDied","Data":"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2"} Mar 16 00:23:13 crc kubenswrapper[4983]: I0316 00:23:13.288220 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerStarted","Data":"170d15e1acedf5bba4266a2e8a9f558a7f2f0cc3269fc453b068a5bc544c83b9"} Mar 16 00:23:14 crc kubenswrapper[4983]: I0316 00:23:14.296188 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerStarted","Data":"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba"} Mar 16 00:23:15 crc kubenswrapper[4983]: I0316 00:23:15.303574 4983 generic.go:334] "Generic (PLEG): container finished" podID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerID="3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba" exitCode=0 Mar 16 00:23:15 crc kubenswrapper[4983]: I0316 00:23:15.303616 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerDied","Data":"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba"} Mar 16 00:23:17 crc kubenswrapper[4983]: I0316 00:23:17.316319 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerStarted","Data":"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3"} Mar 16 00:23:17 crc kubenswrapper[4983]: I0316 00:23:17.336394 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-755c7" podStartSLOduration=2.868353231 podStartE2EDuration="6.336380068s" podCreationTimestamp="2026-03-16 00:23:11 +0000 UTC" firstStartedPulling="2026-03-16 00:23:13.289335471 +0000 UTC m=+1001.889433911" lastFinishedPulling="2026-03-16 00:23:16.757362308 
+0000 UTC m=+1005.357460748" observedRunningTime="2026-03-16 00:23:17.332267609 +0000 UTC m=+1005.932366069" watchObservedRunningTime="2026-03-16 00:23:17.336380068 +0000 UTC m=+1005.936478498" Mar 16 00:23:20 crc kubenswrapper[4983]: I0316 00:23:20.335275 4983 generic.go:334] "Generic (PLEG): container finished" podID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerID="ad0129f0fbcf710c158fc5bf41eaefcab120c1afd0eb3d2ef815d5472ca56907" exitCode=0 Mar 16 00:23:20 crc kubenswrapper[4983]: I0316 00:23:20.335341 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerDied","Data":"ad0129f0fbcf710c158fc5bf41eaefcab120c1afd0eb3d2ef815d5472ca56907"} Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.627860 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730291 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730398 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730419 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730449 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730485 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730538 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730579 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730608 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730650 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730683 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730729 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730800 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730836 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.731099 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.731085 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.731475 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.731795 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.732235 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.733263 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.737700 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.737747 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.742902 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk" (OuterVolumeSpecName: "kube-api-access-dcpqk") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "kube-api-access-dcpqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.767286 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832478 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832734 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832748 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832777 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832790 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832802 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832813 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832824 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832835 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.923660 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.934390 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.273199 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.273248 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.317901 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.352262 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerDied","Data":"b616705931879247088ce47bd9e865c198d5a42dd59830a5a6c9d692a05b1e4c"} Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.352312 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b616705931879247088ce47bd9e865c198d5a42dd59830a5a6c9d692a05b1e4c" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.352422 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.393543 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.554348 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.448805 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.448870 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.448917 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.449464 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.449506 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5" gracePeriod=600 Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.603766 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.659909 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.371553 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5" exitCode=0 Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.371626 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5"} Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.372020 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf"} Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.372069 4983 scope.go:117] "RemoveContainer" containerID="c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.372086 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-755c7" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="registry-server" containerID="cri-o://6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" gracePeriod=2 Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.719781 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.873818 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") pod \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.873910 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") pod \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.874160 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") pod \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.875148 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities" (OuterVolumeSpecName: "utilities") pod "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" (UID: "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.884033 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc" (OuterVolumeSpecName: "kube-api-access-j9pqc") pod "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" (UID: "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35"). InnerVolumeSpecName "kube-api-access-j9pqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.940570 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" (UID: "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.975807 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.976103 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.976118 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387240 4983 generic.go:334] "Generic (PLEG): container finished" podID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerID="6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" exitCode=0 Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387295 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerDied","Data":"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3"} Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387363 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerDied","Data":"170d15e1acedf5bba4266a2e8a9f558a7f2f0cc3269fc453b068a5bc544c83b9"} Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387394 4983 scope.go:117] "RemoveContainer" containerID="6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387613 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.406107 4983 scope.go:117] "RemoveContainer" containerID="3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.444789 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.448429 4983 scope.go:117] "RemoveContainer" containerID="21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.455359 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.484043 4983 scope.go:117] "RemoveContainer" containerID="6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.484580 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3\": container with ID starting with 6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3 not found: ID does not exist" containerID="6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.484608 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3"} err="failed to get container status \"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3\": rpc error: code = NotFound desc = could not find container \"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3\": container with ID starting with 6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3 not found: ID does not exist" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.484631 4983 scope.go:117] "RemoveContainer" containerID="3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.485144 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba\": container with ID starting with 3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba not found: ID does not exist" containerID="3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.485167 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba"} err="failed to get container status \"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba\": rpc error: code = NotFound desc = could not find container \"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba\": container with ID starting with 3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba not found: ID does not exist" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.485184 4983 scope.go:117] "RemoveContainer" containerID="21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.485565 4983 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2\": container with ID starting with 21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2 not found: ID does not exist" containerID="21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.485587 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2"} err="failed to get container status \"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2\": rpc error: code = NotFound desc = could not find container \"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2\": container with ID starting with 21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2 not found: ID does not exist" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.959990 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.961505 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="docker-build" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.961652 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="docker-build" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.961896 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="extract-content" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.962017 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="extract-content" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.962159 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="manage-dockerfile" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.962278 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="manage-dockerfile" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.962397 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="git-clone" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.962517 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="git-clone" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.962674 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="registry-server" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.963079 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="registry-server" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.963231 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="extract-utilities" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.963344 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="extract-utilities" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.963633 4983 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="docker-build" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.963845 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="registry-server" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.965160 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.967891 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.968069 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.968515 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.969000 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.971265 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.010870 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.011048 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.011168 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.011380 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.101796 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" path="/var/lib/kubelet/pods/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35/volumes" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112277 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112309 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112328 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112346 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112383 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112432 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112512 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112570 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112598 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") pod 
\"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112625 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112644 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112743 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112843 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.113045 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.113244 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.118880 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213233 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213546 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213610 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213654 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213679 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213735 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213788 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213825 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214124 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214151 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214512 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214532 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214605 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214747 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.222120 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.240354 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.322589 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.538655 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:27 crc kubenswrapper[4983]: I0316 00:23:27.412171 4983 generic.go:334] "Generic (PLEG): container finished" podID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerID="bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159" exitCode=0 Mar 16 00:23:27 crc kubenswrapper[4983]: I0316 00:23:27.412228 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerDied","Data":"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159"} Mar 16 00:23:27 crc kubenswrapper[4983]: I0316 00:23:27.412287 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerStarted","Data":"6801f811951081f606c13f90ca767a1c29a47094846fcf902240672ea47c4ae4"} Mar 16 00:23:28 crc kubenswrapper[4983]: I0316 00:23:28.434708 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerStarted","Data":"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad"} Mar 16 00:23:28 crc kubenswrapper[4983]: I0316 00:23:28.456200 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.456184498 podStartE2EDuration="3.456184498s" podCreationTimestamp="2026-03-16 00:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:28.453504247 +0000 UTC m=+1017.053602677" watchObservedRunningTime="2026-03-16 00:23:28.456184498 +0000 UTC m=+1017.056282928" Mar 16 00:23:36 crc kubenswrapper[4983]: I0316 00:23:36.482122 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:36 crc kubenswrapper[4983]: I0316 00:23:36.482932 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="docker-build" containerID="cri-o://9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" gracePeriod=30 Mar 16 00:23:37 crc kubenswrapper[4983]: I0316 00:23:37.949341 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_d267f9a9-cbff-4b92-8a83-50f8d9bc80e8/docker-build/0.log" Mar 16 00:23:37 crc kubenswrapper[4983]: I0316 00:23:37.950962 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063380 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063457 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063496 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063530 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063565 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063626 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063664 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063704 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063745 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063878 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063917 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063957 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.064411 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.064908 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.064972 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.065314 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.066128 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.066830 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.067204 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.070297 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb" (OuterVolumeSpecName: "kube-api-access-bxmnb") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "kube-api-access-bxmnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.071369 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.075903 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.119343 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:23:38 crc kubenswrapper[4983]: E0316 00:23:38.119592 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="manage-dockerfile" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.119604 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="manage-dockerfile" Mar 16 00:23:38 crc kubenswrapper[4983]: E0316 00:23:38.119614 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="docker-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.119620 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="docker-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.119737 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="docker-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.120695 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.122852 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.123720 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.124493 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.165930 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.165971 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.165984 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.165995 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166006 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166019 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166030 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166043 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166055 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166065 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166088 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.239821 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266707 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266745 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266792 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266808 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266832 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266861 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266881 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 
00:23:38.266916 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266932 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266949 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266971 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.267000 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.267373 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368459 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368508 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368534 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368559 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368589 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368616 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368643 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368704 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368727 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368768 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368799 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368827 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" 
(UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368816 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368907 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369128 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369179 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369580 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369832 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369847 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.370671 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.371313 4983 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.372685 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.398049 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.402140 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.434301 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.501249 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_d267f9a9-cbff-4b92-8a83-50f8d9bc80e8/docker-build/0.log" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.501909 4983 generic.go:334] "Generic (PLEG): container finished" podID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerID="9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" exitCode=1 Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.501970 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerDied","Data":"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad"} Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.501999 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.502012 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerDied","Data":"6801f811951081f606c13f90ca767a1c29a47094846fcf902240672ea47c4ae4"} Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.502041 4983 scope.go:117] "RemoveContainer" containerID="9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.569530 4983 scope.go:117] "RemoveContainer" containerID="bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.589995 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.593542 4983 scope.go:117] "RemoveContainer" containerID="9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" Mar 16 00:23:38 crc kubenswrapper[4983]: E0316 00:23:38.594099 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad\": container with ID starting with 9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad not found: ID does not exist" containerID="9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.594129 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad"} err="failed to get container status \"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad\": rpc error: code = NotFound desc = could not find container \"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad\": container with ID starting with 9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad not found: ID does not exist" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.594150 4983 scope.go:117] "RemoveContainer" containerID="bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159" Mar 16 00:23:38 crc kubenswrapper[4983]: E0316 00:23:38.594543 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159\": container with ID starting with bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159 not found: ID does not exist" containerID="bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.594563 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159"} err="failed to get container status \"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159\": rpc error: code = NotFound desc = could not find container 
\"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159\": container with ID starting with bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159 not found: ID does not exist" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.672328 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.729618 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.833564 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.837614 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:39 crc kubenswrapper[4983]: I0316 00:23:39.511385 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerStarted","Data":"e4ce240877f3e19ee8e22cd3f7f31da5a9601a66ff499b649538e63303f2f660"} Mar 16 00:23:39 crc kubenswrapper[4983]: I0316 00:23:39.513273 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerStarted","Data":"dbfaff915d0fcafb60574b0839b0d5966137d4dc64cbbc24ce53621c7b3c2439"} Mar 16 00:23:40 crc kubenswrapper[4983]: I0316 00:23:40.098886 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" path="/var/lib/kubelet/pods/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8/volumes" Mar 16 00:23:40 crc kubenswrapper[4983]: I0316 00:23:40.519118 4983 generic.go:334] "Generic (PLEG): container finished" podID="8045812f-963d-4b8f-ae8d-584addf74cae" containerID="e4ce240877f3e19ee8e22cd3f7f31da5a9601a66ff499b649538e63303f2f660" exitCode=0 Mar 16 00:23:40 crc kubenswrapper[4983]: I0316 00:23:40.519157 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerDied","Data":"e4ce240877f3e19ee8e22cd3f7f31da5a9601a66ff499b649538e63303f2f660"} Mar 16 00:23:41 crc kubenswrapper[4983]: I0316 00:23:41.529159 4983 generic.go:334] "Generic (PLEG): container finished" podID="8045812f-963d-4b8f-ae8d-584addf74cae" containerID="a502b68860c273118e04b350d0e5115354f1da1c30f1f1256f06eaef0f154670" exitCode=0 Mar 16 00:23:41 crc kubenswrapper[4983]: I0316 00:23:41.529346 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerDied","Data":"a502b68860c273118e04b350d0e5115354f1da1c30f1f1256f06eaef0f154670"} Mar 16 00:23:41 crc kubenswrapper[4983]: I0316 00:23:41.562321 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_8045812f-963d-4b8f-ae8d-584addf74cae/manage-dockerfile/0.log" Mar 16 00:23:42 crc kubenswrapper[4983]: I0316 00:23:42.541352 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" 
event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerStarted","Data":"0471081bd69ebbcfb49f768098962d26cbc3cd949b019ccbe48697254a1b6d9c"} Mar 16 00:23:42 crc kubenswrapper[4983]: I0316 00:23:42.575542 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=4.575517022 podStartE2EDuration="4.575517022s" podCreationTimestamp="2026-03-16 00:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:42.573490548 +0000 UTC m=+1031.173589018" watchObservedRunningTime="2026-03-16 00:23:42.575517022 +0000 UTC m=+1031.175615472" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.138689 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"] Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.139863 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.141931 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.142051 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.142562 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.151229 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"] Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.156892 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") pod \"auto-csr-approver-29560344-6jrbk\" (UID: \"159f5145-349d-4018-a8d2-251363a76196\") " pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.258609 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") pod \"auto-csr-approver-29560344-6jrbk\" (UID: \"159f5145-349d-4018-a8d2-251363a76196\") " pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.279337 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") pod \"auto-csr-approver-29560344-6jrbk\" (UID: \"159f5145-349d-4018-a8d2-251363a76196\") " pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.458065 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.645352 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"] Mar 16 00:24:00 crc kubenswrapper[4983]: W0316 00:24:00.650005 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod159f5145_349d_4018_a8d2_251363a76196.slice/crio-38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687 WatchSource:0}: Error finding container 38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687: Status 404 returned error can't find the container with id 38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687 Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.664458 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" event={"ID":"159f5145-349d-4018-a8d2-251363a76196","Type":"ContainerStarted","Data":"38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687"} Mar 16 00:24:02 crc kubenswrapper[4983]: I0316 00:24:02.686248 4983 generic.go:334] "Generic (PLEG): container finished" podID="159f5145-349d-4018-a8d2-251363a76196" containerID="12df34a0b20427d6c21f033a68c9272147f096ee9a9f4ed3b7d0b3054eb18184" exitCode=0 Mar 16 00:24:02 crc kubenswrapper[4983]: I0316 00:24:02.686312 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" event={"ID":"159f5145-349d-4018-a8d2-251363a76196","Type":"ContainerDied","Data":"12df34a0b20427d6c21f033a68c9272147f096ee9a9f4ed3b7d0b3054eb18184"} Mar 16 00:24:03 crc kubenswrapper[4983]: I0316 00:24:03.960237 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.006377 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") pod \"159f5145-349d-4018-a8d2-251363a76196\" (UID: \"159f5145-349d-4018-a8d2-251363a76196\") " Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.011043 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt" (OuterVolumeSpecName: "kube-api-access-hlngt") pod "159f5145-349d-4018-a8d2-251363a76196" (UID: "159f5145-349d-4018-a8d2-251363a76196"). InnerVolumeSpecName "kube-api-access-hlngt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.108117 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.700927 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" event={"ID":"159f5145-349d-4018-a8d2-251363a76196","Type":"ContainerDied","Data":"38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687"} Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.701322 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687" Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.701026 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:05 crc kubenswrapper[4983]: I0316 00:24:05.023799 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:24:05 crc kubenswrapper[4983]: I0316 00:24:05.027910 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:24:06 crc kubenswrapper[4983]: I0316 00:24:06.101365 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" path="/var/lib/kubelet/pods/1c6e333f-fadd-4c92-8db1-b9a923850fa0/volumes" Mar 16 00:24:06 crc kubenswrapper[4983]: I0316 00:24:06.490106 4983 scope.go:117] "RemoveContainer" containerID="e40b8ff2ea2fe096fb51ca5ef76f5eab03f687249bde3326f40974dcfd1c4938" Mar 16 00:24:41 crc kubenswrapper[4983]: I0316 00:24:41.923288 4983 generic.go:334] "Generic (PLEG): container finished" podID="8045812f-963d-4b8f-ae8d-584addf74cae" containerID="0471081bd69ebbcfb49f768098962d26cbc3cd949b019ccbe48697254a1b6d9c" exitCode=0 Mar 16 00:24:41 crc kubenswrapper[4983]: I0316 00:24:41.923518 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerDied","Data":"0471081bd69ebbcfb49f768098962d26cbc3cd949b019ccbe48697254a1b6d9c"} Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.225993 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251174 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251281 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251309 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251342 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251376 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251419 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251453 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251477 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251540 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251566 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251592 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251614 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.252134 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.252320 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.252624 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.253109 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.254851 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.257332 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). 
InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.258360 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.259701 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.281949 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.299686 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl" (OuterVolumeSpecName: "kube-api-access-s7gkl") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "kube-api-access-s7gkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354498 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354541 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354555 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354569 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354582 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354594 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354605 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354617 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354627 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354638 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.507393 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.558137 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.947917 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerDied","Data":"dbfaff915d0fcafb60574b0839b0d5966137d4dc64cbbc24ce53621c7b3c2439"} Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.948028 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbfaff915d0fcafb60574b0839b0d5966137d4dc64cbbc24ce53621c7b3c2439" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.948150 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:24:45 crc kubenswrapper[4983]: I0316 00:24:45.181988 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:45 crc kubenswrapper[4983]: I0316 00:24:45.188549 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.637821 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:47 crc kubenswrapper[4983]: E0316 00:24:47.638372 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="manage-dockerfile" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638388 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="manage-dockerfile" Mar 16 00:24:47 crc kubenswrapper[4983]: E0316 00:24:47.638402 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159f5145-349d-4018-a8d2-251363a76196" containerName="oc" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638411 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="159f5145-349d-4018-a8d2-251363a76196" containerName="oc" Mar 16 00:24:47 crc kubenswrapper[4983]: E0316 00:24:47.638429 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="docker-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638439 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="docker-build" Mar 16 00:24:47 crc kubenswrapper[4983]: E0316 00:24:47.638458 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="git-clone" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638466 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="git-clone" Mar 16 00:24:47 crc 
kubenswrapper[4983]: I0316 00:24:47.638619 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="159f5145-349d-4018-a8d2-251363a76196" containerName="oc" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638640 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="docker-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.639443 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.642940 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.643068 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.643298 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.644159 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.651705 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722463 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722515 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722549 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722565 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722584 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722604 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722651 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722679 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722698 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722717 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722732 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722874 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823772 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823835 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823863 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823886 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823923 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823995 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824108 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824145 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823923 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824291 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824305 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824329 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") pod 
\"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824354 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824406 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824474 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824987 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824994 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.825154 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.825263 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.825586 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.826401 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.830215 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.830291 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.856123 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.964409 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:24:48 crc kubenswrapper[4983]: I0316 00:24:48.167249 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:48 crc kubenswrapper[4983]: I0316 00:24:48.987237 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerStarted","Data":"f4c1f0dfd1f9245a01177dd7e628b5c4473d772529ba0c664747e93e42f10dc2"} Mar 16 00:24:49 crc kubenswrapper[4983]: I0316 00:24:49.996226 4983 generic.go:334] "Generic (PLEG): container finished" podID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerID="70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792" exitCode=0 Mar 16 00:24:49 crc kubenswrapper[4983]: I0316 00:24:49.996273 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerDied","Data":"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792"} Mar 16 00:24:51 crc kubenswrapper[4983]: I0316 00:24:51.006528 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerStarted","Data":"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1"} Mar 16 00:24:51 crc kubenswrapper[4983]: I0316 00:24:51.042396 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=4.042373256 podStartE2EDuration="4.042373256s" podCreationTimestamp="2026-03-16 00:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:24:51.035573565 +0000 UTC m=+1099.635672005" watchObservedRunningTime="2026-03-16 00:24:51.042373256 +0000 UTC m=+1099.642471706" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.047639 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.048529 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" 
containerName="docker-build" containerID="cri-o://c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" gracePeriod=30 Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.371691 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_d94841dc-28bc-4de0-a8c2-0f64f533a06a/docker-build/0.log" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.372251 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488040 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488136 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488160 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488153 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488199 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488220 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488240 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488263 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488309 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488333 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488433 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488512 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488555 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488860 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.489292 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.489364 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.489476 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.489831 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.490697 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.491687 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.491712 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.491722 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.492176 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.492192 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.492201 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.492211 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.494441 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.494522 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.495003 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb" (OuterVolumeSpecName: "kube-api-access-t4fdb") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "kube-api-access-t4fdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.587635 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.593136 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.593165 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.593178 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.593189 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.750450 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.794736 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.056081 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_d94841dc-28bc-4de0-a8c2-0f64f533a06a/docker-build/0.log" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057178 4983 generic.go:334] "Generic (PLEG): container finished" podID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerID="c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" exitCode=1 Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057230 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerDied","Data":"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1"} Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057294 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerDied","Data":"f4c1f0dfd1f9245a01177dd7e628b5c4473d772529ba0c664747e93e42f10dc2"} Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057327 4983 scope.go:117] "RemoveContainer" containerID="c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057325 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.103933 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.111460 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.119494 4983 scope.go:117] "RemoveContainer" containerID="70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.142700 4983 scope.go:117] "RemoveContainer" containerID="c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" Mar 16 00:24:59 crc kubenswrapper[4983]: E0316 00:24:59.145265 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1\": container with ID starting with c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1 not found: ID does not exist" containerID="c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.145309 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1"} err="failed to get container status \"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1\": rpc error: code = NotFound desc = could not find container \"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1\": container with ID starting with c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1 not found: ID does not exist" Mar 16 00:24:59 crc 
kubenswrapper[4983]: I0316 00:24:59.145337 4983 scope.go:117] "RemoveContainer" containerID="70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792" Mar 16 00:24:59 crc kubenswrapper[4983]: E0316 00:24:59.145708 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792\": container with ID starting with 70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792 not found: ID does not exist" containerID="70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.145745 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792"} err="failed to get container status \"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792\": rpc error: code = NotFound desc = could not find container \"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792\": container with ID starting with 70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792 not found: ID does not exist" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.642131 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:24:59 crc kubenswrapper[4983]: E0316 00:24:59.642833 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="manage-dockerfile" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.642869 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="manage-dockerfile" Mar 16 00:24:59 crc kubenswrapper[4983]: E0316 00:24:59.642892 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="docker-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.642905 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="docker-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.643089 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="docker-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.644591 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.647857 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.647939 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.647960 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.648452 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.681249 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712544 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712613 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712661 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712720 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712807 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712846 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712905 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713016 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713086 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713140 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713193 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713297 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.813884 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.813942 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.813971 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814001 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814042 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814062 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814088 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814099 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814122 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814131 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814158 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814179 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814210 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") pod \"sg-core-2-build\" (UID: 
\"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814235 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814984 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815383 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815519 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815701 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815814 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.826448 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.826450 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.830470 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.967668 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:25:00 crc kubenswrapper[4983]: I0316 00:25:00.100776 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" path="/var/lib/kubelet/pods/d94841dc-28bc-4de0-a8c2-0f64f533a06a/volumes" Mar 16 00:25:00 crc kubenswrapper[4983]: I0316 00:25:00.218962 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:25:01 crc kubenswrapper[4983]: I0316 00:25:01.096192 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerStarted","Data":"27479ee689ee6aea29760c33a15679a69d2edb724c4e782a5da2113d52bcc4b1"} Mar 16 00:25:01 crc kubenswrapper[4983]: I0316 00:25:01.096521 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerStarted","Data":"81b847e03066493ea991298e6d0dc07ee10475361fe583b28e677d8c4d775e55"} Mar 16 00:25:02 crc kubenswrapper[4983]: I0316 00:25:02.105108 4983 generic.go:334] "Generic (PLEG): container finished" podID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerID="27479ee689ee6aea29760c33a15679a69d2edb724c4e782a5da2113d52bcc4b1" exitCode=0 Mar 16 00:25:02 crc kubenswrapper[4983]: I0316 00:25:02.105169 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerDied","Data":"27479ee689ee6aea29760c33a15679a69d2edb724c4e782a5da2113d52bcc4b1"} Mar 16 00:25:03 crc kubenswrapper[4983]: I0316 00:25:03.112602 4983 generic.go:334] "Generic (PLEG): container finished" podID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerID="d151b31f17403c86e514fa536efb02ca9d67d7e4d646b4a4907bb6661ac8f39a" exitCode=0 Mar 16 00:25:03 crc kubenswrapper[4983]: I0316 00:25:03.112656 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerDied","Data":"d151b31f17403c86e514fa536efb02ca9d67d7e4d646b4a4907bb6661ac8f39a"} Mar 16 00:25:03 crc kubenswrapper[4983]: I0316 00:25:03.156856 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_5b639c92-3fb8-4740-9242-9ced86fc4ad9/manage-dockerfile/0.log" Mar 16 00:25:04 crc kubenswrapper[4983]: I0316 00:25:04.154622 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" 
event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerStarted","Data":"09b1ea7b6afbc77d3aef59a4c6c789b34bc1851435c29298bef2043739f1f368"} Mar 16 00:25:04 crc kubenswrapper[4983]: I0316 00:25:04.190489 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.190467882 podStartE2EDuration="5.190467882s" podCreationTimestamp="2026-03-16 00:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:25:04.184478703 +0000 UTC m=+1112.784577143" watchObservedRunningTime="2026-03-16 00:25:04.190467882 +0000 UTC m=+1112.790566332" Mar 16 00:25:53 crc kubenswrapper[4983]: I0316 00:25:53.448871 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:25:53 crc kubenswrapper[4983]: I0316 00:25:53.449433 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.138091 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"] Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.143014 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.147282 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.147616 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.148543 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.154141 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"] Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.178008 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") pod \"auto-csr-approver-29560346-xpbzq\" (UID: \"3cb0fb90-2fa7-4376-8997-678868e0832a\") " pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.279614 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") pod \"auto-csr-approver-29560346-xpbzq\" (UID: \"3cb0fb90-2fa7-4376-8997-678868e0832a\") " pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.301235 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fws68\" 
(UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") pod \"auto-csr-approver-29560346-xpbzq\" (UID: \"3cb0fb90-2fa7-4376-8997-678868e0832a\") " pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.459277 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.738668 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"] Mar 16 00:26:00 crc kubenswrapper[4983]: W0316 00:26:00.753182 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cb0fb90_2fa7_4376_8997_678868e0832a.slice/crio-9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff WatchSource:0}: Error finding container 9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff: Status 404 returned error can't find the container with id 9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.801830 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" event={"ID":"3cb0fb90-2fa7-4376-8997-678868e0832a","Type":"ContainerStarted","Data":"9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff"} Mar 16 00:26:02 crc kubenswrapper[4983]: I0316 00:26:02.814655 4983 generic.go:334] "Generic (PLEG): container finished" podID="3cb0fb90-2fa7-4376-8997-678868e0832a" containerID="2ff36755f0536b7a806a46f7132557ff7e7da3301af8403a26d90d34c8f6b2e8" exitCode=0 Mar 16 00:26:02 crc kubenswrapper[4983]: I0316 00:26:02.814713 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" event={"ID":"3cb0fb90-2fa7-4376-8997-678868e0832a","Type":"ContainerDied","Data":"2ff36755f0536b7a806a46f7132557ff7e7da3301af8403a26d90d34c8f6b2e8"} Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.030170 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.132229 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") pod \"3cb0fb90-2fa7-4376-8997-678868e0832a\" (UID: \"3cb0fb90-2fa7-4376-8997-678868e0832a\") " Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.137456 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68" (OuterVolumeSpecName: "kube-api-access-fws68") pod "3cb0fb90-2fa7-4376-8997-678868e0832a" (UID: "3cb0fb90-2fa7-4376-8997-678868e0832a"). InnerVolumeSpecName "kube-api-access-fws68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.233603 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.827143 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" event={"ID":"3cb0fb90-2fa7-4376-8997-678868e0832a","Type":"ContainerDied","Data":"9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff"} Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.827178 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff" Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.827195 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:05 crc kubenswrapper[4983]: I0316 00:26:05.085787 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:26:05 crc kubenswrapper[4983]: I0316 00:26:05.092888 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:26:06 crc kubenswrapper[4983]: I0316 00:26:06.110205 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3356aa9a-4f16-4602-97b0-1118f7e55776" path="/var/lib/kubelet/pods/3356aa9a-4f16-4602-97b0-1118f7e55776/volumes" Mar 16 00:26:06 crc kubenswrapper[4983]: I0316 00:26:06.592613 4983 scope.go:117] "RemoveContainer" containerID="ec962f764e58dc18fb35bd2bf73250ec727cbdfcfdec0a585462238f6e2032c9" Mar 16 00:26:23 crc kubenswrapper[4983]: I0316 00:26:23.448430 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:26:23 crc kubenswrapper[4983]: I0316 00:26:23.449100 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.448481 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.449015 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.449062 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.449686 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.449738 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf" gracePeriod=600 Mar 16 00:26:54 crc kubenswrapper[4983]: I0316 00:26:54.181822 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf" exitCode=0 Mar 16 00:26:54 crc kubenswrapper[4983]: I0316 00:26:54.181893 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf"} Mar 16 00:26:54 crc kubenswrapper[4983]: I0316 00:26:54.182348 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350"} Mar 16 00:26:54 crc kubenswrapper[4983]: I0316 00:26:54.182377 4983 scope.go:117] "RemoveContainer" containerID="46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5" Mar 16 00:27:57 crc kubenswrapper[4983]: I0316 00:27:57.591354 4983 generic.go:334] "Generic (PLEG): container finished" podID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerID="09b1ea7b6afbc77d3aef59a4c6c789b34bc1851435c29298bef2043739f1f368" exitCode=0 Mar 16 00:27:57 crc kubenswrapper[4983]: I0316 00:27:57.591438 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerDied","Data":"09b1ea7b6afbc77d3aef59a4c6c789b34bc1851435c29298bef2043739f1f368"} Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.856245 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898650 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898731 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898802 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898829 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898850 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898873 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898888 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898908 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898923 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898941 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhjkg\" (UniqueName: 
\"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898959 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898976 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.900044 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.900552 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.900952 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.901325 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.901778 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.902104 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). 
InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.905355 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.905865 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg" (OuterVolumeSpecName: "kube-api-access-mhjkg") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "kube-api-access-mhjkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.906877 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.913638 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.000573 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.000882 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.000955 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001014 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001071 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001127 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001183 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001250 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001309 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001366 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.249520 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.305546 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.607326 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerDied","Data":"81b847e03066493ea991298e6d0dc07ee10475361fe583b28e677d8c4d775e55"} Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.607656 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b847e03066493ea991298e6d0dc07ee10475361fe583b28e677d8c4d775e55" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.607396 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144007 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:28:00 crc kubenswrapper[4983]: E0316 00:28:00.144303 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb0fb90-2fa7-4376-8997-678868e0832a" containerName="oc" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144317 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb0fb90-2fa7-4376-8997-678868e0832a" containerName="oc" Mar 16 00:28:00 crc kubenswrapper[4983]: E0316 00:28:00.144339 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="git-clone" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144346 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="git-clone" Mar 16 00:28:00 crc kubenswrapper[4983]: E0316 00:28:00.144353 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="manage-dockerfile" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144361 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="manage-dockerfile" Mar 16 00:28:00 crc kubenswrapper[4983]: E0316 00:28:00.144377 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="docker-build" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144384 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="docker-build" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144489 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb0fb90-2fa7-4376-8997-678868e0832a" containerName="oc" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144499 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="docker-build" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.145284 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.147739 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.147896 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.148072 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.149233 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.216401 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") pod \"auto-csr-approver-29560348-bfz7x\" (UID: \"ee9b49cf-d10f-4047-aa75-b89a01652d64\") " pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.317339 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") pod \"auto-csr-approver-29560348-bfz7x\" (UID: \"ee9b49cf-d10f-4047-aa75-b89a01652d64\") " pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.332450 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") pod \"auto-csr-approver-29560348-bfz7x\" (UID: \"ee9b49cf-d10f-4047-aa75-b89a01652d64\") " pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.475017 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.657404 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.662879 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:28:01 crc kubenswrapper[4983]: I0316 00:28:01.387310 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:01 crc kubenswrapper[4983]: I0316 00:28:01.445108 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:01 crc kubenswrapper[4983]: I0316 00:28:01.621198 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" event={"ID":"ee9b49cf-d10f-4047-aa75-b89a01652d64","Type":"ContainerStarted","Data":"a1cb8041e3041462e906e27b823bce28977cc94d8a26168231ac6b952be270f5"} Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.469816 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.472018 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.478295 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.478876 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.479024 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.479104 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.494647 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572285 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572345 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572418 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572451 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572483 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572530 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572571 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572730 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572814 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572849 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572888 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572933 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.634895 4983 generic.go:334] "Generic (PLEG): container finished" podID="ee9b49cf-d10f-4047-aa75-b89a01652d64" containerID="78e4a4ba6d978cbff10ef4b981fae6b3ca9ed9e86332628a7ba7f28e90fd7ea7" exitCode=0 Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.634948 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29560348-bfz7x" event={"ID":"ee9b49cf-d10f-4047-aa75-b89a01652d64","Type":"ContainerDied","Data":"78e4a4ba6d978cbff10ef4b981fae6b3ca9ed9e86332628a7ba7f28e90fd7ea7"} Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.674220 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.674303 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.674327 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.675178 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.675704 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.675785 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.675864 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.676653 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.676683 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") pod \"sg-bridge-1-build\" (UID: 
\"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.676714 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.676736 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677079 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677126 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677186 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677254 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677276 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677331 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.678387 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc 
kubenswrapper[4983]: I0316 00:28:03.678893 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.678949 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.696359 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.697482 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.698092 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.701057 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.787971 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.243039 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.642510 4983 generic.go:334] "Generic (PLEG): container finished" podID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerID="e7a4f48409b5b7c54ae90135f67eb66bb8944343dc341edc9f4aefa5c3b5777f" exitCode=0 Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.643268 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerDied","Data":"e7a4f48409b5b7c54ae90135f67eb66bb8944343dc341edc9f4aefa5c3b5777f"} Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.643293 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerStarted","Data":"0210089ac19e5c094dcc7240af8c61cd94f591ab00ebf2020e2d3bf726eca0d6"} Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.912246 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.997269 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") pod \"ee9b49cf-d10f-4047-aa75-b89a01652d64\" (UID: \"ee9b49cf-d10f-4047-aa75-b89a01652d64\") " Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.002354 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7" (OuterVolumeSpecName: "kube-api-access-tm2z7") pod "ee9b49cf-d10f-4047-aa75-b89a01652d64" (UID: "ee9b49cf-d10f-4047-aa75-b89a01652d64"). InnerVolumeSpecName "kube-api-access-tm2z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.098817 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.651881 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerStarted","Data":"591b456456f261ac0f2b9814ab85dfd85135db5fb073dd674dfeacb880c9917a"} Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.654272 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" event={"ID":"ee9b49cf-d10f-4047-aa75-b89a01652d64","Type":"ContainerDied","Data":"a1cb8041e3041462e906e27b823bce28977cc94d8a26168231ac6b952be270f5"} Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.654324 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1cb8041e3041462e906e27b823bce28977cc94d8a26168231ac6b952be270f5" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.654338 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.683423 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.6834096130000002 podStartE2EDuration="2.683409613s" podCreationTimestamp="2026-03-16 00:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:28:05.676981862 +0000 UTC m=+1294.277080292" watchObservedRunningTime="2026-03-16 00:28:05.683409613 +0000 UTC m=+1294.283508033" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.985893 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.995198 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:28:06 crc kubenswrapper[4983]: I0316 00:28:06.104998 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" path="/var/lib/kubelet/pods/d0d707a0-1b40-4364-9a61-cde76e2c80a1/volumes" Mar 16 00:28:06 crc kubenswrapper[4983]: I0316 00:28:06.696653 4983 scope.go:117] "RemoveContainer" containerID="b6146a6dfae8df822feda4cd12d6f532571e6f993bc0ef397b108d23a0fa9361" Mar 16 00:28:11 crc kubenswrapper[4983]: I0316 00:28:11.694999 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_3c28cfcf-6ddc-4eaa-956a-f42746839382/docker-build/0.log" Mar 16 00:28:11 crc kubenswrapper[4983]: I0316 00:28:11.696466 4983 generic.go:334] "Generic (PLEG): container finished" podID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerID="591b456456f261ac0f2b9814ab85dfd85135db5fb073dd674dfeacb880c9917a" exitCode=1 Mar 16 00:28:11 crc kubenswrapper[4983]: I0316 00:28:11.696500 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerDied","Data":"591b456456f261ac0f2b9814ab85dfd85135db5fb073dd674dfeacb880c9917a"} Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.039060 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_3c28cfcf-6ddc-4eaa-956a-f42746839382/docker-build/0.log" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.041221 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116621 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116687 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116720 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116744 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116808 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116836 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116863 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116897 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116882 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116923 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116957 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117006 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117024 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117265 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117686 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117171 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117970 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.118166 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.118558 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.122010 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.122114 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.122515 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.122579 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg" (OuterVolumeSpecName: "kube-api-access-8l5lg") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "kube-api-access-8l5lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.194552 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218384 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218434 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218453 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218472 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218492 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218511 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218527 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218546 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218562 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218583 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.498209 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.523163 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.712519 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_3c28cfcf-6ddc-4eaa-956a-f42746839382/docker-build/0.log" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.713203 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerDied","Data":"0210089ac19e5c094dcc7240af8c61cd94f591ab00ebf2020e2d3bf726eca0d6"} Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.713261 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0210089ac19e5c094dcc7240af8c61cd94f591ab00ebf2020e2d3bf726eca0d6" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.713311 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.850469 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.856219 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:14 crc kubenswrapper[4983]: I0316 00:28:14.104405 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" path="/var/lib/kubelet/pods/3c28cfcf-6ddc-4eaa-956a-f42746839382/volumes" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.519914 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 16 00:28:15 crc kubenswrapper[4983]: E0316 00:28:15.520178 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="docker-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520194 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="docker-build" Mar 16 00:28:15 crc kubenswrapper[4983]: E0316 00:28:15.520211 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="manage-dockerfile" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520219 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="manage-dockerfile" Mar 16 00:28:15 crc kubenswrapper[4983]: E0316 00:28:15.520234 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9b49cf-d10f-4047-aa75-b89a01652d64" containerName="oc" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520242 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9b49cf-d10f-4047-aa75-b89a01652d64" containerName="oc" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520377 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="docker-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520391 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9b49cf-d10f-4047-aa75-b89a01652d64" 
containerName="oc" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.521424 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.524170 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.524632 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.524942 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.527636 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.542408 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656117 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656166 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656198 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656216 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656305 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656378 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656415 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656485 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656599 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656659 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656683 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656715 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758349 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758418 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758456 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc 
kubenswrapper[4983]: I0316 00:28:15.758476 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758496 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758526 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758550 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758559 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758588 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758627 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758660 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758687 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758709 4983 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758798 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759115 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759141 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759270 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759394 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759424 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759518 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.760182 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.763219 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") pod 
\"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.771385 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.777101 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.845145 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:16 crc kubenswrapper[4983]: I0316 00:28:16.303216 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 16 00:28:16 crc kubenswrapper[4983]: I0316 00:28:16.733112 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerStarted","Data":"4b7e8f0b0262d13b5d3dad043009038fc9741d48b180835616abe0b9cb4b0b85"} Mar 16 00:28:16 crc kubenswrapper[4983]: I0316 00:28:16.733460 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerStarted","Data":"1172d46c4823802088437b057d123769d09a25da1fff7fc5d7b9e421e92ae6f5"} Mar 16 00:28:17 crc kubenswrapper[4983]: I0316 00:28:17.741018 4983 generic.go:334] "Generic (PLEG): container finished" podID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerID="4b7e8f0b0262d13b5d3dad043009038fc9741d48b180835616abe0b9cb4b0b85" exitCode=0 Mar 16 00:28:17 crc kubenswrapper[4983]: I0316 00:28:17.741081 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerDied","Data":"4b7e8f0b0262d13b5d3dad043009038fc9741d48b180835616abe0b9cb4b0b85"} Mar 16 00:28:18 crc kubenswrapper[4983]: I0316 00:28:18.747690 4983 generic.go:334] "Generic (PLEG): container finished" podID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerID="94ea866889c4c93c4d2c2bf288edba184639a401760736267e888a00e8c78537" exitCode=0 Mar 16 00:28:18 crc kubenswrapper[4983]: I0316 00:28:18.747781 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerDied","Data":"94ea866889c4c93c4d2c2bf288edba184639a401760736267e888a00e8c78537"} Mar 16 00:28:18 crc kubenswrapper[4983]: I0316 00:28:18.799006 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_bf3cda5e-7ab3-44d7-baa0-d98b65d0d759/manage-dockerfile/0.log" Mar 16 00:28:19 crc kubenswrapper[4983]: I0316 00:28:19.758998 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerStarted","Data":"3d928fbc21c2d15074f4f972b682df33246ef2426138d87f9d0046e2e737dafb"} Mar 16 00:28:53 crc 
kubenswrapper[4983]: I0316 00:28:53.448502 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:28:53 crc kubenswrapper[4983]: I0316 00:28:53.449385 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:29:03 crc kubenswrapper[4983]: I0316 00:29:03.067229 4983 generic.go:334] "Generic (PLEG): container finished" podID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerID="3d928fbc21c2d15074f4f972b682df33246ef2426138d87f9d0046e2e737dafb" exitCode=0
Mar 16 00:29:03 crc kubenswrapper[4983]: I0316 00:29:03.067317 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerDied","Data":"3d928fbc21c2d15074f4f972b682df33246ef2426138d87f9d0046e2e737dafb"}
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.352280 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.467848 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468146 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468183 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468270 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468289 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468303 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468324 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468357 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468373 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468389 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468406 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468441 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468661 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468687 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.469748 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.469937 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.470280 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.472550 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.477092 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.477111 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb" (OuterVolumeSpecName: "kube-api-access-zdqfb") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "kube-api-access-zdqfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.477187 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.478646 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569391 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569426 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569470 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569483 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569494 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569505 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569516 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569529 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569539 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569550 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.579385 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.670936 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.084563 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerDied","Data":"1172d46c4823802088437b057d123769d09a25da1fff7fc5d7b9e421e92ae6f5"}
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.084605 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.084614 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1172d46c4823802088437b057d123769d09a25da1fff7fc5d7b9e421e92ae6f5"
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.223709 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.280359 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.457864 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:08 crc kubenswrapper[4983]: E0316 00:29:08.458435 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="git-clone"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.458451 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="git-clone"
Mar 16 00:29:08 crc kubenswrapper[4983]: E0316 00:29:08.458461 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="manage-dockerfile"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.458468 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="manage-dockerfile"
Mar 16 00:29:08 crc kubenswrapper[4983]: E0316 00:29:08.458496 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="docker-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.458506 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="docker-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.458621 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="docker-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.459414 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.461143 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.461183 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.461187 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.462763 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.468741 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521580 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521633 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521655 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521768 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521821 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521859 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521878 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521897 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521916 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521949 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521971 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.522026 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622330 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622380 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622397 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622431 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622455 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622473 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622487 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622506 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622526 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622552 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622571 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622590 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622671 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.623345 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.624076 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.624127 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.624306 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.624548 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.625239 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.625662 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.625988 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.629028 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.629483 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.641837 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.775352 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.944780 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:09 crc kubenswrapper[4983]: I0316 00:29:09.109172 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerStarted","Data":"e0f3e3718874685653e30f81f0245af34cc90402115bbca57e3bcd0a3130386f"}
Mar 16 00:29:10 crc kubenswrapper[4983]: I0316 00:29:10.116370 4983 generic.go:334] "Generic (PLEG): container finished" podID="48198167-8197-43b5-847b-6573fc24f312" containerID="7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae" exitCode=0
Mar 16 00:29:10 crc kubenswrapper[4983]: I0316 00:29:10.116463 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerDied","Data":"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae"}
Mar 16 00:29:11 crc kubenswrapper[4983]: I0316 00:29:11.123534 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerStarted","Data":"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"}
Mar 16 00:29:11 crc kubenswrapper[4983]: I0316 00:29:11.146298 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.146280803 podStartE2EDuration="3.146280803s" podCreationTimestamp="2026-03-16 00:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:29:11.141545858 +0000 UTC m=+1359.741644278" watchObservedRunningTime="2026-03-16 00:29:11.146280803 +0000 UTC m=+1359.746379233"
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.258935 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.259836 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="docker-build" containerID="cri-o://b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970" gracePeriod=30
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.685444 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_48198167-8197-43b5-847b-6573fc24f312/docker-build/0.log"
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.686461 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780433 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780492 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780528 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780547 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780579 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780602 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780623 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780637 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780668 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780731 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780788 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780807 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") "
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780850 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780977 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.781054 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.781068 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.783744 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.783901 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.783994 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.783982 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.784266 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.789237 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj" (OuterVolumeSpecName: "kube-api-access-wbjsj") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "kube-api-access-wbjsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.789272 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.789535 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.854020 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882923 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882959 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882970 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882980 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882992 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.883004 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.883013 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.883021 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.883029 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.154880 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180004 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_48198167-8197-43b5-847b-6573fc24f312/docker-build/0.log"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180740 4983 generic.go:334] "Generic (PLEG): container finished" podID="48198167-8197-43b5-847b-6573fc24f312" containerID="b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970" exitCode=1
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180840 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerDied","Data":"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"}
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180886 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerDied","Data":"e0f3e3718874685653e30f81f0245af34cc90402115bbca57e3bcd0a3130386f"}
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180919 4983 scope.go:117] "RemoveContainer" containerID="b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.181019 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.187995 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.204002 4983 scope.go:117] "RemoveContainer" containerID="7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.227660 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.231934 4983 scope.go:117] "RemoveContainer" containerID="b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"
Mar 16 00:29:20 crc kubenswrapper[4983]: E0316 00:29:20.232481 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970\": container with ID starting with b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970 not found: ID does not exist" containerID="b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.232548 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"} err="failed to get container status \"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970\": rpc error: code = NotFound desc = could not find container \"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970\": container with ID starting with b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970 not found: ID does not exist"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.232587 4983 scope.go:117] "RemoveContainer" containerID="7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae"
Mar 16 00:29:20 crc kubenswrapper[4983]: E0316 00:29:20.233208 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae\": container with ID starting with 7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae not found: ID does not exist" containerID="7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.233826 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae"} err="failed to get container status \"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae\": rpc error: code = NotFound desc = could not find container \"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae\": container with ID starting with 7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae not found: ID does not exist"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.234013 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.890615 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Mar 16 00:29:20 crc kubenswrapper[4983]: E0316 00:29:20.890962 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="manage-dockerfile"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.890979 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="manage-dockerfile"
Mar 16 00:29:20 crc kubenswrapper[4983]: E0316 00:29:20.891008 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="docker-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.891015 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="docker-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.891157 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="docker-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.892447 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.894285 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.894704 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.895327 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.895355 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.914917 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999642 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999700 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999766 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999794 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999882 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999910 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999938 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000097 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000159 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000203 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000239 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000291 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101391 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101440 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101486 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101508 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101633 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101783 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101788 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101918 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101960 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101956 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101993 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102067 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102100 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102124 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102266 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102336 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102379 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102541 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102612 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102708 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102996 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.105077 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.114287 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.119262 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.213462 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.399781 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:29:22 crc kubenswrapper[4983]: I0316 00:29:22.101918 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48198167-8197-43b5-847b-6573fc24f312" path="/var/lib/kubelet/pods/48198167-8197-43b5-847b-6573fc24f312/volumes" Mar 16 00:29:22 crc kubenswrapper[4983]: I0316 00:29:22.195824 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerStarted","Data":"5ba424032e01a5aec1dafe53a9a7d6f1b7244d21b8fbf332aa8bc6d19c1c8863"} Mar 16 00:29:22 crc kubenswrapper[4983]: I0316 00:29:22.195872 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerStarted","Data":"dc3a9ca53d1d789e35b74d5f0511c33239a8145d878bf5432c485cec4d0b52c4"} Mar 16 00:29:23 crc kubenswrapper[4983]: I0316 00:29:23.206454 4983 generic.go:334] "Generic (PLEG): container finished" podID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerID="5ba424032e01a5aec1dafe53a9a7d6f1b7244d21b8fbf332aa8bc6d19c1c8863" exitCode=0 Mar 16 00:29:23 crc kubenswrapper[4983]: I0316 00:29:23.206875 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerDied","Data":"5ba424032e01a5aec1dafe53a9a7d6f1b7244d21b8fbf332aa8bc6d19c1c8863"} Mar 16 00:29:23 crc kubenswrapper[4983]: I0316 00:29:23.448694 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:29:23 crc kubenswrapper[4983]: I0316 00:29:23.448788 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:24 crc kubenswrapper[4983]: I0316 00:29:24.213563 4983 generic.go:334] "Generic (PLEG): container finished" podID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerID="c57595e008cb9889dca6c1a1c59e02696afe2ba9673c999bed193761ef0ca2b8" exitCode=0 Mar 16 00:29:24 crc kubenswrapper[4983]: I0316 00:29:24.213608 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerDied","Data":"c57595e008cb9889dca6c1a1c59e02696afe2ba9673c999bed193761ef0ca2b8"} Mar 16 00:29:24 crc kubenswrapper[4983]: I0316 00:29:24.253268 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_de1c6f9c-f134-479d-a5ae-5ab93b30b2e3/manage-dockerfile/0.log" Mar 16 00:29:25 crc kubenswrapper[4983]: I0316 00:29:25.222487 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerStarted","Data":"5f7cbddba7e24ee372140ec3e6d1283660f1e04ffafec527fdafc57d3a279e56"} Mar 16 00:29:25 crc kubenswrapper[4983]: I0316 00:29:25.250426 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.250411828 podStartE2EDuration="5.250411828s" podCreationTimestamp="2026-03-16 00:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:29:25.248541608 +0000 UTC m=+1373.848640048" watchObservedRunningTime="2026-03-16 00:29:25.250411828 +0000 UTC m=+1373.850510248" Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.448336 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.448899 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.448948 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.449609 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.449671 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350" gracePeriod=600 Mar 16 00:29:54 crc kubenswrapper[4983]: I0316 00:29:54.437517 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350" exitCode=0 Mar 16 00:29:54 crc kubenswrapper[4983]: I0316 00:29:54.437587 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350"} Mar 16 00:29:54 crc kubenswrapper[4983]: I0316 00:29:54.437916 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"} Mar 16 00:29:54 crc kubenswrapper[4983]: I0316 00:29:54.437943 4983 scope.go:117] "RemoveContainer" 
containerID="4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.160025 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.161738 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.164949 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.165614 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.165837 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.171146 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h"] Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.172371 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.179826 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.180019 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.187857 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.192416 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h"] Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.262018 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.262083 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.262136 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") pod \"auto-csr-approver-29560350-spjzd\" (UID: \"57c14e51-5c0b-467c-ba79-ac6f39239445\") " pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.262171 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.362938 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") pod \"auto-csr-approver-29560350-spjzd\" (UID: \"57c14e51-5c0b-467c-ba79-ac6f39239445\") " pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.363000 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.363088 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.363118 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.364489 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.384587 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.386895 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") pod \"auto-csr-approver-29560350-spjzd\" (UID: \"57c14e51-5c0b-467c-ba79-ac6f39239445\") " pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.389089 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") pod 
\"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.493418 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.507471 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.886440 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:30:00 crc kubenswrapper[4983]: W0316 00:30:00.895371 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57c14e51_5c0b_467c_ba79_ac6f39239445.slice/crio-02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989 WatchSource:0}: Error finding container 02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989: Status 404 returned error can't find the container with id 02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989 Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.937408 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h"] Mar 16 00:30:00 crc kubenswrapper[4983]: W0316 00:30:00.939149 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c811eb7_248a_49ed_be14_95285f2c4400.slice/crio-d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe WatchSource:0}: Error finding container d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe: Status 404 returned error can't find the container with id d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe Mar 16 00:30:01 crc kubenswrapper[4983]: I0316 00:30:01.504724 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-spjzd" event={"ID":"57c14e51-5c0b-467c-ba79-ac6f39239445","Type":"ContainerStarted","Data":"02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989"} Mar 16 00:30:01 crc kubenswrapper[4983]: I0316 00:30:01.506670 4983 generic.go:334] "Generic (PLEG): container finished" podID="6c811eb7-248a-49ed-be14-95285f2c4400" containerID="5bc308bbb2692ff07a3afcb9644b74f23c0b61727bee2ee07e82ad016a8bf8a0" exitCode=0 Mar 16 00:30:01 crc kubenswrapper[4983]: I0316 00:30:01.506707 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" event={"ID":"6c811eb7-248a-49ed-be14-95285f2c4400","Type":"ContainerDied","Data":"5bc308bbb2692ff07a3afcb9644b74f23c0b61727bee2ee07e82ad016a8bf8a0"} Mar 16 00:30:01 crc kubenswrapper[4983]: I0316 00:30:01.506722 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" event={"ID":"6c811eb7-248a-49ed-be14-95285f2c4400","Type":"ContainerStarted","Data":"d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe"} Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.756676 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.796125 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") pod \"6c811eb7-248a-49ed-be14-95285f2c4400\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.796457 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") pod \"6c811eb7-248a-49ed-be14-95285f2c4400\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.796504 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") pod \"6c811eb7-248a-49ed-be14-95285f2c4400\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.797518 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c811eb7-248a-49ed-be14-95285f2c4400" (UID: "6c811eb7-248a-49ed-be14-95285f2c4400"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.801958 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c811eb7-248a-49ed-be14-95285f2c4400" (UID: "6c811eb7-248a-49ed-be14-95285f2c4400"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.811922 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n" (OuterVolumeSpecName: "kube-api-access-7gm9n") pod "6c811eb7-248a-49ed-be14-95285f2c4400" (UID: "6c811eb7-248a-49ed-be14-95285f2c4400"). InnerVolumeSpecName "kube-api-access-7gm9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.898274 4983 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.898311 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.898321 4983 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.522360 4983 generic.go:334] "Generic (PLEG): container finished" podID="57c14e51-5c0b-467c-ba79-ac6f39239445" containerID="1accee3cbf491221453bc50f3bfc4fc15297e3a876200e21860d5dd4e3e66686" exitCode=0 Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.522546 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-spjzd" event={"ID":"57c14e51-5c0b-467c-ba79-ac6f39239445","Type":"ContainerDied","Data":"1accee3cbf491221453bc50f3bfc4fc15297e3a876200e21860d5dd4e3e66686"} Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.523637 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" event={"ID":"6c811eb7-248a-49ed-be14-95285f2c4400","Type":"ContainerDied","Data":"d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe"} Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.523664 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe" Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.523713 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:04 crc kubenswrapper[4983]: I0316 00:30:04.759117 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:04 crc kubenswrapper[4983]: I0316 00:30:04.821468 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") pod \"57c14e51-5c0b-467c-ba79-ac6f39239445\" (UID: \"57c14e51-5c0b-467c-ba79-ac6f39239445\") " Mar 16 00:30:04 crc kubenswrapper[4983]: I0316 00:30:04.826220 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6" (OuterVolumeSpecName: "kube-api-access-rrwz6") pod "57c14e51-5c0b-467c-ba79-ac6f39239445" (UID: "57c14e51-5c0b-467c-ba79-ac6f39239445"). InnerVolumeSpecName "kube-api-access-rrwz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:04 crc kubenswrapper[4983]: I0316 00:30:04.923102 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.539846 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-spjzd" event={"ID":"57c14e51-5c0b-467c-ba79-ac6f39239445","Type":"ContainerDied","Data":"02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989"} Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.539892 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989" Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.539903 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.820789 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"] Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.824984 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"] Mar 16 00:30:06 crc kubenswrapper[4983]: I0316 00:30:06.100131 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159f5145-349d-4018-a8d2-251363a76196" path="/var/lib/kubelet/pods/159f5145-349d-4018-a8d2-251363a76196/volumes" Mar 16 00:30:06 crc kubenswrapper[4983]: I0316 00:30:06.760083 4983 scope.go:117] "RemoveContainer" containerID="12df34a0b20427d6c21f033a68c9272147f096ee9a9f4ed3b7d0b3054eb18184" Mar 16 00:30:14 crc kubenswrapper[4983]: I0316 00:30:14.596921 4983 generic.go:334] "Generic (PLEG): container finished" podID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerID="5f7cbddba7e24ee372140ec3e6d1283660f1e04ffafec527fdafc57d3a279e56" exitCode=0 Mar 16 00:30:14 crc kubenswrapper[4983]: I0316 00:30:14.597018 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerDied","Data":"5f7cbddba7e24ee372140ec3e6d1283660f1e04ffafec527fdafc57d3a279e56"} Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.846806 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890158 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890226 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890265 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890291 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890328 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890363 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890392 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890421 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890446 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890473 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: 
\"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890505 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890528 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890955 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.891175 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.891213 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.891494 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.892907 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.893525 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.894293 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.895775 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf" (OuterVolumeSpecName: "kube-api-access-dlgqf") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "kube-api-access-dlgqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.895798 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.896899 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.990901 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991749 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991786 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991795 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991805 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991813 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991821 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991832 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991840 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991849 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991857 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991866 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.624506 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerDied","Data":"dc3a9ca53d1d789e35b74d5f0511c33239a8145d878bf5432c485cec4d0b52c4"} Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.624554 4983 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="dc3a9ca53d1d789e35b74d5f0511c33239a8145d878bf5432c485cec4d0b52c4" Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.624608 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.760866 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.802899 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.498613 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-64877956d4-ljbdp"] Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499406 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="manage-dockerfile" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499420 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="manage-dockerfile" Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499434 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c14e51-5c0b-467c-ba79-ac6f39239445" containerName="oc" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499441 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c14e51-5c0b-467c-ba79-ac6f39239445" containerName="oc" Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499452 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="git-clone" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499459 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="git-clone" Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499467 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="docker-build" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499473 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="docker-build" Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499489 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c811eb7-248a-49ed-be14-95285f2c4400" containerName="collect-profiles" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499496 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c811eb7-248a-49ed-be14-95285f2c4400" containerName="collect-profiles" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499630 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c811eb7-248a-49ed-be14-95285f2c4400" containerName="collect-profiles" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499647 4983 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="57c14e51-5c0b-467c-ba79-ac6f39239445" containerName="oc" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499660 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="docker-build" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.500159 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.502314 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-gpsrc" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.515673 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-64877956d4-ljbdp"] Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.692943 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-runner\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.693001 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtb6s\" (UniqueName: \"kubernetes.io/projected/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-kube-api-access-vtb6s\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.794041 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-runner\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.794291 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtb6s\" (UniqueName: \"kubernetes.io/projected/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-kube-api-access-vtb6s\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.794655 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-runner\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.816697 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtb6s\" (UniqueName: \"kubernetes.io/projected/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-kube-api-access-vtb6s\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.818356 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" Mar 16 00:30:25 crc kubenswrapper[4983]: I0316 00:30:25.321287 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-64877956d4-ljbdp"] Mar 16 00:30:25 crc kubenswrapper[4983]: W0316 00:30:25.329573 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c3140b4_67ae_4012_bd8b_9cecbcb4ff4d.slice/crio-0e1782a6c96d0af0091b346a4b2a0ea56e6c06b786cb5ce795e42cd398cbce58 WatchSource:0}: Error finding container 0e1782a6c96d0af0091b346a4b2a0ea56e6c06b786cb5ce795e42cd398cbce58: Status 404 returned error can't find the container with id 0e1782a6c96d0af0091b346a4b2a0ea56e6c06b786cb5ce795e42cd398cbce58 Mar 16 00:30:25 crc kubenswrapper[4983]: I0316 00:30:25.678119 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" event={"ID":"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d","Type":"ContainerStarted","Data":"0e1782a6c96d0af0091b346a4b2a0ea56e6c06b786cb5ce795e42cd398cbce58"} Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.292342 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"] Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.293596 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.296331 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-q6nrk" Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.327213 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"] Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.449188 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/b8f4edcf-0403-4d59-b045-e618c6aabff5-kube-api-access-96k2g\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.449244 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b8f4edcf-0403-4d59-b045-e618c6aabff5-runner\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.550231 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/b8f4edcf-0403-4d59-b045-e618c6aabff5-kube-api-access-96k2g\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.550279 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b8f4edcf-0403-4d59-b045-e618c6aabff5-runner\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: 
\"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.551066 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b8f4edcf-0403-4d59-b045-e618c6aabff5-runner\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.569669 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/b8f4edcf-0403-4d59-b045-e618c6aabff5-kube-api-access-96k2g\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.620661 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" Mar 16 00:30:35 crc kubenswrapper[4983]: I0316 00:30:35.294976 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"] Mar 16 00:30:36 crc kubenswrapper[4983]: W0316 00:30:36.635305 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f4edcf_0403_4d59_b045_e618c6aabff5.slice/crio-c233c6c0ae186025d223961e27523063970ef3f5f2ca414126761bf4d6844634 WatchSource:0}: Error finding container c233c6c0ae186025d223961e27523063970ef3f5f2ca414126761bf4d6844634: Status 404 returned error can't find the container with id c233c6c0ae186025d223961e27523063970ef3f5f2ca414126761bf4d6844634 Mar 16 00:30:36 crc kubenswrapper[4983]: I0316 00:30:36.751807 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" event={"ID":"b8f4edcf-0403-4d59-b045-e618c6aabff5","Type":"ContainerStarted","Data":"c233c6c0ae186025d223961e27523063970ef3f5f2ca414126761bf4d6844634"} Mar 16 00:30:40 crc kubenswrapper[4983]: E0316 00:30:40.033676 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest" Mar 16 00:30:40 crc kubenswrapper[4983]: E0316 00:30:40.034173 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1773621017,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vtb6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-64877956d4-ljbdp_service-telemetry(5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 16 00:30:40 crc kubenswrapper[4983]: E0316 00:30:40.035442 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" podUID="5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d" Mar 16 00:30:40 crc kubenswrapper[4983]: E0316 00:30:40.785653 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" podUID="5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d" Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.864854 4983 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-6ztth"] Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.868072 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.868394 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"] Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.946538 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.946583 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.946608 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.047379 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.047703 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.047740 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.047893 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.048092 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " 
pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.067906 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.189438 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.668570 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"] Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.806434 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerStarted","Data":"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff"} Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.806683 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerStarted","Data":"c2f45a75c848cc0c542208a6f9b28b0dde0446c369790008bc4a1b6a2fd51cfb"} Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.810377 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" event={"ID":"b8f4edcf-0403-4d59-b045-e618c6aabff5","Type":"ContainerStarted","Data":"9543f2b0720f02916d77fb37da1892affb82983543db57540f3d393228b0e571"} Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.844385 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" podStartSLOduration=8.933393343 podStartE2EDuration="16.844365667s" podCreationTimestamp="2026-03-16 00:30:28 +0000 UTC" firstStartedPulling="2026-03-16 00:30:36.645214473 +0000 UTC m=+1445.245312923" lastFinishedPulling="2026-03-16 00:30:44.556186817 +0000 UTC m=+1453.156285247" observedRunningTime="2026-03-16 00:30:44.838982543 +0000 UTC m=+1453.439080973" watchObservedRunningTime="2026-03-16 00:30:44.844365667 +0000 UTC m=+1453.444464097" Mar 16 00:30:45 crc kubenswrapper[4983]: I0316 00:30:45.817385 4983 generic.go:334] "Generic (PLEG): container finished" podID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerID="ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff" exitCode=0 Mar 16 00:30:45 crc kubenswrapper[4983]: I0316 00:30:45.817436 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerDied","Data":"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff"} Mar 16 00:30:47 crc kubenswrapper[4983]: I0316 00:30:47.831437 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerStarted","Data":"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df"} Mar 16 00:30:48 crc kubenswrapper[4983]: I0316 00:30:48.838474 4983 generic.go:334] "Generic (PLEG): container finished" podID="655a4c8a-248d-4630-889a-0932c2c2f2b9" 
containerID="b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df" exitCode=0 Mar 16 00:30:48 crc kubenswrapper[4983]: I0316 00:30:48.838564 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerDied","Data":"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df"} Mar 16 00:30:49 crc kubenswrapper[4983]: I0316 00:30:49.846517 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerStarted","Data":"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a"} Mar 16 00:30:49 crc kubenswrapper[4983]: I0316 00:30:49.867943 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6ztth" podStartSLOduration=4.239481924 podStartE2EDuration="7.867926095s" podCreationTimestamp="2026-03-16 00:30:42 +0000 UTC" firstStartedPulling="2026-03-16 00:30:45.818705596 +0000 UTC m=+1454.418804026" lastFinishedPulling="2026-03-16 00:30:49.447149767 +0000 UTC m=+1458.047248197" observedRunningTime="2026-03-16 00:30:49.863495197 +0000 UTC m=+1458.463593627" watchObservedRunningTime="2026-03-16 00:30:49.867926095 +0000 UTC m=+1458.468024515" Mar 16 00:30:53 crc kubenswrapper[4983]: I0316 00:30:53.202435 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:53 crc kubenswrapper[4983]: I0316 00:30:53.202938 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:54 crc kubenswrapper[4983]: I0316 00:30:54.258645 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6ztth" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" probeResult="failure" output=< Mar 16 00:30:54 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s Mar 16 00:30:54 crc kubenswrapper[4983]: > Mar 16 00:30:54 crc kubenswrapper[4983]: I0316 00:30:54.881635 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" event={"ID":"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d","Type":"ContainerStarted","Data":"475cbfbd9b8abe2a8b2f00a7f8c8062d6047c6f9d50e30e6165d3bc7beb1a7af"} Mar 16 00:30:54 crc kubenswrapper[4983]: I0316 00:30:54.899787 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" podStartSLOduration=2.439470684 podStartE2EDuration="30.899770104s" podCreationTimestamp="2026-03-16 00:30:24 +0000 UTC" firstStartedPulling="2026-03-16 00:30:25.331584002 +0000 UTC m=+1433.931682432" lastFinishedPulling="2026-03-16 00:30:53.791883422 +0000 UTC m=+1462.391981852" observedRunningTime="2026-03-16 00:30:54.897874873 +0000 UTC m=+1463.497973293" watchObservedRunningTime="2026-03-16 00:30:54.899770104 +0000 UTC m=+1463.499868534" Mar 16 00:31:03 crc kubenswrapper[4983]: I0316 00:31:03.243081 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:31:03 crc kubenswrapper[4983]: I0316 00:31:03.321003 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:31:03 crc kubenswrapper[4983]: I0316 
Mar 16 00:31:03 crc kubenswrapper[4983]: I0316 00:31:03.487353 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"]
Mar 16 00:31:04 crc kubenswrapper[4983]: I0316 00:31:04.947640 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6ztth" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" containerID="cri-o://5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" gracePeriod=2
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.090387 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"]
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.091724 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.096288 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.096500 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.096832 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.097048 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.097192 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.097205 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-kdvm9"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.097381 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.113590 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"]
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249032 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249116 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249177 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249228 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249286 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249307 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350476 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350588 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350638 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350665 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350707 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.352529 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.353207 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.353293 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.358630 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.358635 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.358635 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.369401 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.369523 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.372446 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.421943 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.454382 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") pod \"655a4c8a-248d-4630-889a-0932c2c2f2b9\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.454469 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") pod \"655a4c8a-248d-4630-889a-0932c2c2f2b9\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.454574 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") pod \"655a4c8a-248d-4630-889a-0932c2c2f2b9\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.456185 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities" (OuterVolumeSpecName: "utilities") pod "655a4c8a-248d-4630-889a-0932c2c2f2b9" (UID: "655a4c8a-248d-4630-889a-0932c2c2f2b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.457857 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9" (OuterVolumeSpecName: "kube-api-access-dt6r9") pod "655a4c8a-248d-4630-889a-0932c2c2f2b9" (UID: "655a4c8a-248d-4630-889a-0932c2c2f2b9"). InnerVolumeSpecName "kube-api-access-dt6r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.556039 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.556098 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.599595 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "655a4c8a-248d-4630-889a-0932c2c2f2b9" (UID: "655a4c8a-248d-4630-889a-0932c2c2f2b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.601780 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"] Mar 16 00:31:05 crc kubenswrapper[4983]: W0316 00:31:05.606518 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5e69fb8_91f6_4bfc_b8a5_2f9e77922ac3.slice/crio-914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8 WatchSource:0}: Error finding container 914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8: Status 404 returned error can't find the container with id 914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8 Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.657476 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957303 4983 generic.go:334] "Generic (PLEG): container finished" podID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerID="5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" exitCode=0 Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957379 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerDied","Data":"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a"} Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957410 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerDied","Data":"c2f45a75c848cc0c542208a6f9b28b0dde0446c369790008bc4a1b6a2fd51cfb"} Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957430 4983 scope.go:117] "RemoveContainer" containerID="5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957585 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.965138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" event={"ID":"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3","Type":"ContainerStarted","Data":"914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8"} Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.981320 4983 scope.go:117] "RemoveContainer" containerID="b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.996919 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"] Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.003312 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"] Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.006792 4983 scope.go:117] "RemoveContainer" containerID="ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.024671 4983 scope.go:117] "RemoveContainer" containerID="5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" Mar 16 00:31:06 crc kubenswrapper[4983]: E0316 00:31:06.025155 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a\": container with ID starting with 5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a not found: ID does not exist" containerID="5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025195 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a"} err="failed to get container status \"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a\": rpc error: code = NotFound desc = could not find container \"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a\": container with ID starting with 5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a not found: ID does not exist" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025225 4983 scope.go:117] "RemoveContainer" containerID="b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df" Mar 16 00:31:06 crc kubenswrapper[4983]: E0316 00:31:06.025547 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df\": container with ID starting with b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df not found: ID does not exist" containerID="b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025580 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df"} err="failed to get container status \"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df\": rpc error: code = NotFound desc = could not find container \"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df\": container with ID starting with b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df not 
found: ID does not exist" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025600 4983 scope.go:117] "RemoveContainer" containerID="ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff" Mar 16 00:31:06 crc kubenswrapper[4983]: E0316 00:31:06.025885 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff\": container with ID starting with ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff not found: ID does not exist" containerID="ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025917 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff"} err="failed to get container status \"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff\": rpc error: code = NotFound desc = could not find container \"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff\": container with ID starting with ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff not found: ID does not exist" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.105233 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" path="/var/lib/kubelet/pods/655a4c8a-248d-4630-889a-0932c2c2f2b9/volumes" Mar 16 00:31:12 crc kubenswrapper[4983]: I0316 00:31:12.003799 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" event={"ID":"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3","Type":"ContainerStarted","Data":"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"} Mar 16 00:31:12 crc kubenswrapper[4983]: I0316 00:31:12.022244 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" podStartSLOduration=1.608243621 podStartE2EDuration="7.022224227s" podCreationTimestamp="2026-03-16 00:31:05 +0000 UTC" firstStartedPulling="2026-03-16 00:31:05.610577239 +0000 UTC m=+1474.210675669" lastFinishedPulling="2026-03-16 00:31:11.024557815 +0000 UTC m=+1479.624656275" observedRunningTime="2026-03-16 00:31:12.019652698 +0000 UTC m=+1480.619751128" watchObservedRunningTime="2026-03-16 00:31:12.022224227 +0000 UTC m=+1480.622322657" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.319818 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.320470 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.320489 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.320526 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="extract-utilities" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.320539 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="extract-utilities" Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.320564 4983 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="extract-content" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.320576 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="extract-content" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.320784 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.322427 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325138 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325337 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325581 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325771 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325935 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.326064 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.326211 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.326373 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.328383 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.337267 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.338500 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-hpv2q" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501527 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-tls-assets\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501579 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501624 
4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501791 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501855 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501911 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501965 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502046 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c952d0a-6462-4081-8603-935847aefe14-config-out\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502084 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502125 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-web-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502200 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502227 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmjlh\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-kube-api-access-dmjlh\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603374 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603437 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603473 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603509 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603540 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603578 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c952d0a-6462-4081-8603-935847aefe14-config-out\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603612 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603648 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-web-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.603667 4983 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.603807 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls podName:4c952d0a-6462-4081-8603-935847aefe14 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:16.103735275 +0000 UTC m=+1484.703833715 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "4c952d0a-6462-4081-8603-935847aefe14") : secret "default-prometheus-proxy-tls" not found Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603682 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604144 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmjlh\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-kube-api-access-dmjlh\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604280 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-tls-assets\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604326 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604394 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604610 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: 
\"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604948 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.605318 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610160 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-web-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610254 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610327 4983 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610508 4983 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d26c15a88fbfc712781200258d29c496371f2e856d0ee515a41c781249c487a5/globalmount\"" pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610529 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-tls-assets\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.611676 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.618960 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c952d0a-6462-4081-8603-935847aefe14-config-out\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.628685 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmjlh\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-kube-api-access-dmjlh\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.656966 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:16 crc kubenswrapper[4983]: I0316 00:31:16.112192 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:16 crc kubenswrapper[4983]: E0316 00:31:16.112393 4983 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found
Mar 16 00:31:16 crc kubenswrapper[4983]: E0316 00:31:16.112465 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls podName:4c952d0a-6462-4081-8603-935847aefe14 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:17.112446009 +0000 UTC m=+1485.712544449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "4c952d0a-6462-4081-8603-935847aefe14") : secret "default-prometheus-proxy-tls" not found
Mar 16 00:31:17 crc kubenswrapper[4983]: I0316 00:31:17.125510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:17 crc kubenswrapper[4983]: I0316 00:31:17.132603 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:17 crc kubenswrapper[4983]: I0316 00:31:17.156102 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:17 crc kubenswrapper[4983]: I0316 00:31:17.559190 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"]
Mar 16 00:31:17 crc kubenswrapper[4983]: W0316 00:31:17.581911 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c952d0a_6462_4081_8603_935847aefe14.slice/crio-f2154d67f0d902a913fa5c0f9f59f38ed9c42f8d7c69d3bdb5686fca4f3dc16f WatchSource:0}: Error finding container f2154d67f0d902a913fa5c0f9f59f38ed9c42f8d7c69d3bdb5686fca4f3dc16f: Status 404 returned error can't find the container with id f2154d67f0d902a913fa5c0f9f59f38ed9c42f8d7c69d3bdb5686fca4f3dc16f
Mar 16 00:31:18 crc kubenswrapper[4983]: I0316 00:31:18.049741 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"f2154d67f0d902a913fa5c0f9f59f38ed9c42f8d7c69d3bdb5686fca4f3dc16f"}
Mar 16 00:31:22 crc kubenswrapper[4983]: I0316 00:31:22.082466 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"a4be303f3d3093e70f5251c499293adcd190f43b473540357b97c4147dad99f7"}
Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.083559 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4h8qf"]
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.095975 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4h8qf"] Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.239530 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z8b4\" (UniqueName: \"kubernetes.io/projected/1f2a95c7-282e-4200-ac63-1a114726205b-kube-api-access-5z8b4\") pod \"default-snmp-webhook-6856cfb745-4h8qf\" (UID: \"1f2a95c7-282e-4200-ac63-1a114726205b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.341115 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z8b4\" (UniqueName: \"kubernetes.io/projected/1f2a95c7-282e-4200-ac63-1a114726205b-kube-api-access-5z8b4\") pod \"default-snmp-webhook-6856cfb745-4h8qf\" (UID: \"1f2a95c7-282e-4200-ac63-1a114726205b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.374038 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z8b4\" (UniqueName: \"kubernetes.io/projected/1f2a95c7-282e-4200-ac63-1a114726205b-kube-api-access-5z8b4\") pod \"default-snmp-webhook-6856cfb745-4h8qf\" (UID: \"1f2a95c7-282e-4200-ac63-1a114726205b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.403410 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.635858 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4h8qf"] Mar 16 00:31:25 crc kubenswrapper[4983]: W0316 00:31:25.656199 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f2a95c7_282e_4200_ac63_1a114726205b.slice/crio-36f9dc083475ccd68eae41ae2165e36d12e11efa792baea2630c7e3d0614a8c9 WatchSource:0}: Error finding container 36f9dc083475ccd68eae41ae2165e36d12e11efa792baea2630c7e3d0614a8c9: Status 404 returned error can't find the container with id 36f9dc083475ccd68eae41ae2165e36d12e11efa792baea2630c7e3d0614a8c9 Mar 16 00:31:26 crc kubenswrapper[4983]: I0316 00:31:26.127564 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" event={"ID":"1f2a95c7-282e-4200-ac63-1a114726205b","Type":"ContainerStarted","Data":"36f9dc083475ccd68eae41ae2165e36d12e11efa792baea2630c7e3d0614a8c9"} Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.956506 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.959172 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964110 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964300 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-lptq9" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964198 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964300 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964497 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.965013 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.969549 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099563 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099608 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jpw8\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-kube-api-access-4jpw8\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099653 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-config-volume\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099741 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099778 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099806 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099825 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a35c1573-8441-4092-8560-86f7726028dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a35c1573-8441-4092-8560-86f7726028dc\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099860 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-web-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099880 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f21bd8c3-505c-465a-afeb-404a9136ea58-config-out\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.149468 4983 generic.go:334] "Generic (PLEG): container finished" podID="4c952d0a-6462-4081-8603-935847aefe14" containerID="a4be303f3d3093e70f5251c499293adcd190f43b473540357b97c4147dad99f7" exitCode=0 Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.149522 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerDied","Data":"a4be303f3d3093e70f5251c499293adcd190f43b473540357b97c4147dad99f7"} Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201214 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f21bd8c3-505c-465a-afeb-404a9136ea58-config-out\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201317 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jpw8\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-kube-api-access-4jpw8\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201362 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-config-volume\") pod \"alertmanager-default-0\" 
(UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201384 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201406 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201425 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201447 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a35c1573-8441-4092-8560-86f7726028dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a35c1573-8441-4092-8560-86f7726028dc\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: E0316 00:31:29.201566 4983 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 16 00:31:29 crc kubenswrapper[4983]: E0316 00:31:29.201618 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls podName:f21bd8c3-505c-465a-afeb-404a9136ea58 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:29.701601708 +0000 UTC m=+1498.301700138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f21bd8c3-505c-465a-afeb-404a9136ea58") : secret "default-alertmanager-proxy-tls" not found Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201797 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-web-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.206968 4983 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.207164 4983 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a35c1573-8441-4092-8560-86f7726028dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a35c1573-8441-4092-8560-86f7726028dc\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/be416d59a4e641c720596f27e26bd8c509ca92bf347a2cb40479f09ccc772acf/globalmount\"" pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.207426 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.207439 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.208379 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-web-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.208397 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f21bd8c3-505c-465a-afeb-404a9136ea58-config-out\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.209028 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-config-volume\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.212553 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.227735 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jpw8\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-kube-api-access-4jpw8\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.239095 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a35c1573-8441-4092-8560-86f7726028dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a35c1573-8441-4092-8560-86f7726028dc\") pod 
\"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.709002 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: E0316 00:31:29.709209 4983 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 16 00:31:29 crc kubenswrapper[4983]: E0316 00:31:29.709310 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls podName:f21bd8c3-505c-465a-afeb-404a9136ea58 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:30.709291995 +0000 UTC m=+1499.309390425 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f21bd8c3-505c-465a-afeb-404a9136ea58") : secret "default-alertmanager-proxy-tls" not found Mar 16 00:31:30 crc kubenswrapper[4983]: I0316 00:31:30.722308 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:30 crc kubenswrapper[4983]: E0316 00:31:30.722541 4983 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 16 00:31:30 crc kubenswrapper[4983]: E0316 00:31:30.722740 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls podName:f21bd8c3-505c-465a-afeb-404a9136ea58 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:32.722719977 +0000 UTC m=+1501.322818407 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f21bd8c3-505c-465a-afeb-404a9136ea58") : secret "default-alertmanager-proxy-tls" not found Mar 16 00:31:32 crc kubenswrapper[4983]: I0316 00:31:32.752328 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:32 crc kubenswrapper[4983]: I0316 00:31:32.778494 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:32 crc kubenswrapper[4983]: I0316 00:31:32.893896 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-lptq9" Mar 16 00:31:32 crc kubenswrapper[4983]: I0316 00:31:32.902589 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:33 crc kubenswrapper[4983]: I0316 00:31:33.174054 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" event={"ID":"1f2a95c7-282e-4200-ac63-1a114726205b","Type":"ContainerStarted","Data":"be75eb3523134a08101e6f9f2c283897f22428d514bb003660b28d3d65d781db"} Mar 16 00:31:33 crc kubenswrapper[4983]: I0316 00:31:33.188745 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" podStartSLOduration=1.270071024 podStartE2EDuration="8.18873036s" podCreationTimestamp="2026-03-16 00:31:25 +0000 UTC" firstStartedPulling="2026-03-16 00:31:25.657989791 +0000 UTC m=+1494.258088241" lastFinishedPulling="2026-03-16 00:31:32.576649147 +0000 UTC m=+1501.176747577" observedRunningTime="2026-03-16 00:31:33.187641721 +0000 UTC m=+1501.787740151" watchObservedRunningTime="2026-03-16 00:31:33.18873036 +0000 UTC m=+1501.788828790" Mar 16 00:31:33 crc kubenswrapper[4983]: I0316 00:31:33.372147 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 16 00:31:34 crc kubenswrapper[4983]: I0316 00:31:34.181998 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"5533756ed7ab22efdb85b446e09f294bae54d82deec5fe73e3c9c19b6ba14572"} Mar 16 00:31:35 crc kubenswrapper[4983]: I0316 00:31:35.195224 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"2c52f71e3a3618defbe3cada2f72b680a7ee28c3d43c22c86d95d1f72f9090c0"} Mar 16 00:31:37 crc kubenswrapper[4983]: I0316 00:31:37.210515 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"de037c5a808a8b93cfe8e1480355cbc928d4ee7e260f061ff9177e929bfdbeeb"} Mar 16 00:31:40 crc kubenswrapper[4983]: I0316 00:31:40.233647 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"12824209a1723cb781c2e7089d7056063a2a17a195feeea948f170be7f02a146"} Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.240897 4983 generic.go:334] "Generic (PLEG): container finished" podID="f21bd8c3-505c-465a-afeb-404a9136ea58" containerID="2c52f71e3a3618defbe3cada2f72b680a7ee28c3d43c22c86d95d1f72f9090c0" exitCode=0 Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.240940 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerDied","Data":"2c52f71e3a3618defbe3cada2f72b680a7ee28c3d43c22c86d95d1f72f9090c0"} Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.491843 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"] Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.493547 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.497743 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.498552 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-qksbs" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.498605 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.498658 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.508658 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"] Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578216 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578298 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578319 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/1b5da06a-5282-43dc-b876-76eb99ba6f9d-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578351 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmgt\" (UniqueName: \"kubernetes.io/projected/1b5da06a-5282-43dc-b876-76eb99ba6f9d-kube-api-access-cjmgt\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578370 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b5da06a-5282-43dc-b876-76eb99ba6f9d-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.679655 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.680783 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.680832 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b5da06a-5282-43dc-b876-76eb99ba6f9d-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.680882 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmgt\" (UniqueName: \"kubernetes.io/projected/1b5da06a-5282-43dc-b876-76eb99ba6f9d-kube-api-access-cjmgt\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.680909 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b5da06a-5282-43dc-b876-76eb99ba6f9d-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc 
kubenswrapper[4983]: I0316 00:31:41.681339 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b5da06a-5282-43dc-b876-76eb99ba6f9d-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: E0316 00:31:41.681405 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:31:41 crc kubenswrapper[4983]: E0316 00:31:41.681463 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls podName:1b5da06a-5282-43dc-b876-76eb99ba6f9d nodeName:}" failed. No retries permitted until 2026-03-16 00:31:42.181448318 +0000 UTC m=+1510.781546748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" (UID: "1b5da06a-5282-43dc-b876-76eb99ba6f9d") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.681908 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b5da06a-5282-43dc-b876-76eb99ba6f9d-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.696435 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.707550 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmgt\" (UniqueName: \"kubernetes.io/projected/1b5da06a-5282-43dc-b876-76eb99ba6f9d-kube-api-access-cjmgt\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:42 crc kubenswrapper[4983]: I0316 00:31:42.191495 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:42 crc kubenswrapper[4983]: E0316 00:31:42.191787 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:31:42 crc kubenswrapper[4983]: E0316 00:31:42.192387 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls podName:1b5da06a-5282-43dc-b876-76eb99ba6f9d nodeName:}" failed. No retries permitted until 2026-03-16 00:31:43.192360091 +0000 UTC m=+1511.792458521 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" (UID: "1b5da06a-5282-43dc-b876-76eb99ba6f9d") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:31:43 crc kubenswrapper[4983]: I0316 00:31:43.212845 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:43 crc kubenswrapper[4983]: I0316 00:31:43.223199 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:43 crc kubenswrapper[4983]: I0316 00:31:43.319058 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" Mar 16 00:31:43 crc kubenswrapper[4983]: I0316 00:31:43.778243 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"] Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.126953 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"] Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.129725 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.133865 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.133886 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.138660 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"] Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227476 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227541 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzp5z\" (UniqueName: \"kubernetes.io/projected/c6f00393-4848-47e5-8836-3e3b9c3a5b95-kube-api-access-mzp5z\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227656 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227683 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6f00393-4848-47e5-8836-3e3b9c3a5b95-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227712 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c6f00393-4848-47e5-8836-3e3b9c3a5b95-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.265126 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"df64c22da3a2e39417b17fad62d7e7fb95f18d4d0420be571ad099b7cfe6a4f1"} Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328381 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6f00393-4848-47e5-8836-3e3b9c3a5b95-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328426 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c6f00393-4848-47e5-8836-3e3b9c3a5b95-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328480 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328508 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzp5z\" (UniqueName: \"kubernetes.io/projected/c6f00393-4848-47e5-8836-3e3b9c3a5b95-kube-api-access-mzp5z\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328559 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: E0316 00:31:44.329381 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:31:44 crc kubenswrapper[4983]: E0316 00:31:44.329456 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls podName:c6f00393-4848-47e5-8836-3e3b9c3a5b95 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:44.829434307 +0000 UTC m=+1513.429532737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" (UID: "c6f00393-4848-47e5-8836-3e3b9c3a5b95") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.329664 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6f00393-4848-47e5-8836-3e3b9c3a5b95-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.330312 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c6f00393-4848-47e5-8836-3e3b9c3a5b95-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.333886 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.347586 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzp5z\" (UniqueName: \"kubernetes.io/projected/c6f00393-4848-47e5-8836-3e3b9c3a5b95-kube-api-access-mzp5z\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.836361 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:44 crc kubenswrapper[4983]: E0316 00:31:44.836552 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:31:44 crc kubenswrapper[4983]: E0316 00:31:44.836965 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls podName:c6f00393-4848-47e5-8836-3e3b9c3a5b95 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:45.836941848 +0000 UTC m=+1514.437040288 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" (UID: "c6f00393-4848-47e5-8836-3e3b9c3a5b95") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:31:45 crc kubenswrapper[4983]: I0316 00:31:45.852582 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:45 crc kubenswrapper[4983]: I0316 00:31:45.858623 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:45 crc kubenswrapper[4983]: I0316 00:31:45.956634 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" Mar 16 00:31:46 crc kubenswrapper[4983]: I0316 00:31:46.490130 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"] Mar 16 00:31:49 crc kubenswrapper[4983]: I0316 00:31:49.309684 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"8bd030eca22010c76446e25eb2fc48acfa8c5fac806edeae1a7316946c3d9e94"} Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.383895 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"] Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.385867 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.391590 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.391793 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.393520 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"] Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.522916 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f90f8a5e-67de-4058-9e42-0caf957b6b71-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.523021 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gldcq\" (UniqueName: \"kubernetes.io/projected/f90f8a5e-67de-4058-9e42-0caf957b6b71-kube-api-access-gldcq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.523059 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f90f8a5e-67de-4058-9e42-0caf957b6b71-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.523152 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.523212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624098 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gldcq\" (UniqueName: \"kubernetes.io/projected/f90f8a5e-67de-4058-9e42-0caf957b6b71-kube-api-access-gldcq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" Mar 16 00:31:50 crc 
kubenswrapper[4983]: I0316 00:31:50.624182 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f90f8a5e-67de-4058-9e42-0caf957b6b71-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624256 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624304 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624343 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f90f8a5e-67de-4058-9e42-0caf957b6b71-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: E0316 00:31:50.624547 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 16 00:31:50 crc kubenswrapper[4983]: E0316 00:31:50.624627 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls podName:f90f8a5e-67de-4058-9e42-0caf957b6b71 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:51.124611325 +0000 UTC m=+1519.724709845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" (UID: "f90f8a5e-67de-4058-9e42-0caf957b6b71") : secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624864 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f90f8a5e-67de-4058-9e42-0caf957b6b71-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.626992 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f90f8a5e-67de-4058-9e42-0caf957b6b71-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.630419 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.641266 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldcq\" (UniqueName: \"kubernetes.io/projected/f90f8a5e-67de-4058-9e42-0caf957b6b71-kube-api-access-gldcq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.131572 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:51 crc kubenswrapper[4983]: E0316 00:31:51.131976 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 16 00:31:51 crc kubenswrapper[4983]: E0316 00:31:51.132071 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls podName:f90f8a5e-67de-4058-9e42-0caf957b6b71 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:52.132046546 +0000 UTC m=+1520.732144976 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" (UID: "f90f8a5e-67de-4058-9e42-0caf957b6b71") : secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.334995 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"588319b4fba0eb4322bbf9e60c60f0d283ea764278fa05a6a525acbc14c99fbd"}
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.342375 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"72f9951154030ce2edcf04af07553d25087ad4a56d8d1c37f77bd72ddbca233b"}
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.345047 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"8015ab262caa609b03d18e295501cedb02b1d7eb16855c09aae9b5749ed3baed"}
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.370126 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=4.010286671 podStartE2EDuration="37.370112309s" podCreationTimestamp="2026-03-16 00:31:14 +0000 UTC" firstStartedPulling="2026-03-16 00:31:17.583649896 +0000 UTC m=+1486.183748326" lastFinishedPulling="2026-03-16 00:31:50.943475544 +0000 UTC m=+1519.543573964" observedRunningTime="2026-03-16 00:31:51.369238195 +0000 UTC m=+1519.969336625" watchObservedRunningTime="2026-03-16 00:31:51.370112309 +0000 UTC m=+1519.970210739"
Mar 16 00:31:52 crc kubenswrapper[4983]: I0316 00:31:52.148354 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:52 crc kubenswrapper[4983]: I0316 00:31:52.155235 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:52 crc kubenswrapper[4983]: I0316 00:31:52.156704 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:52 crc kubenswrapper[4983]: I0316 00:31:52.211888 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:53 crc kubenswrapper[4983]: I0316 00:31:53.350607 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"]
Mar 16 00:31:53 crc kubenswrapper[4983]: I0316 00:31:53.448492 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:31:53 crc kubenswrapper[4983]: I0316 00:31:53.449120 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.391471 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"c517c932658f217915b64832431b2f1667ae56507da615ca5f0b78adca4bb3e9"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.413191 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.413238 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"93baf073698f37ccdc048dccbc446c7dc8f440d38e9cecc4747f7232434b84ea"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.413248 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"f0eb1123813d7cd48be0a4e4e53b889743af2cc9ad2a2317f163bc2779f92851"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.418809 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.422183 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6"}
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.079844 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"]
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.081020 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.084257 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.084487 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.092849 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"]
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.203682 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.203975 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.204147 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.204275 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98h7r\" (UniqueName: \"kubernetes.io/projected/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-kube-api-access-98h7r\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.306161 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.307060 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.307094 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98h7r\" (UniqueName: \"kubernetes.io/projected/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-kube-api-access-98h7r\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.307178 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.307538 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.308122 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.313317 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.323508 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98h7r\" (UniqueName: \"kubernetes.io/projected/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-kube-api-access-98h7r\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.407091 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.154543 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"]
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.155937 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.159069 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.163485 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"]
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.227373 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lwnw\" (UniqueName: \"kubernetes.io/projected/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-kube-api-access-8lwnw\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.227552 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.227658 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.227787 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.329162 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.329218 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.329256 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.329311 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lwnw\" (UniqueName: \"kubernetes.io/projected/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-kube-api-access-8lwnw\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.330046 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.330377 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.338647 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.345297 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lwnw\" (UniqueName: \"kubernetes.io/projected/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-kube-api-access-8lwnw\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.450067 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"9aed0ed8df66e785bc9d5d66f4079e1e38b7174bef7af9b734679dda61f08c58"}
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.484134 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.423502 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"]
Mar 16 00:31:59 crc kubenswrapper[4983]: W0316 00:31:59.432120 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4d42cf2_f1fd_4aa7_b950_0d60911c50af.slice/crio-d0e051ca78b302732e8e8f63a4d7f022c9f8e966621cb79720eb2f037144d233 WatchSource:0}: Error finding container d0e051ca78b302732e8e8f63a4d7f022c9f8e966621cb79720eb2f037144d233: Status 404 returned error can't find the container with id d0e051ca78b302732e8e8f63a4d7f022c9f8e966621cb79720eb2f037144d233
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.506179 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"f7840ecdf4fcdf60af5f4eb8211546c563ed5fb0e52ea433c19b1d3a98a46270"}
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.509179 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"0aa8a0dec936622d4540fb518c56ffa469d81190d27e045f40b84e9b81140e2b"}
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.510481 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"d0e051ca78b302732e8e8f63a4d7f022c9f8e966621cb79720eb2f037144d233"}
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.511659 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"ca9a8afe432f12da01937c3d80e779bbcdd39612200bd7c66290ae71a8c956f2"}
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.536468 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" podStartSLOduration=5.069774346 podStartE2EDuration="15.536442357s" podCreationTimestamp="2026-03-16 00:31:44 +0000 UTC" firstStartedPulling="2026-03-16 00:31:48.675991858 +0000 UTC m=+1517.276090288" lastFinishedPulling="2026-03-16 00:31:59.142659869 +0000 UTC m=+1527.742758299" observedRunningTime="2026-03-16 00:31:59.528201247 +0000 UTC m=+1528.128299687" watchObservedRunningTime="2026-03-16 00:31:59.536442357 +0000 UTC m=+1528.136540787"
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.579521 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" podStartSLOduration=3.7774968859999998 podStartE2EDuration="9.579500036s" podCreationTimestamp="2026-03-16 00:31:50 +0000 UTC" firstStartedPulling="2026-03-16 00:31:53.364025053 +0000 UTC m=+1521.964123483" lastFinishedPulling="2026-03-16 00:31:59.166028203 +0000 UTC m=+1527.766126633" observedRunningTime="2026-03-16 00:31:59.558076274 +0000 UTC m=+1528.158174704" watchObservedRunningTime="2026-03-16 00:31:59.579500036 +0000 UTC m=+1528.179598466"
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.584602 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" podStartSLOduration=3.2272827 podStartE2EDuration="18.584576831s" podCreationTimestamp="2026-03-16 00:31:41 +0000 UTC" firstStartedPulling="2026-03-16 00:31:43.781936207 +0000 UTC m=+1512.382034647" lastFinishedPulling="2026-03-16 00:31:59.139230348 +0000 UTC m=+1527.739328778" observedRunningTime="2026-03-16 00:31:59.582052364 +0000 UTC m=+1528.182150794" watchObservedRunningTime="2026-03-16 00:31:59.584576831 +0000 UTC m=+1528.184675281"
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.716165 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"]
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.158086 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"]
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.159271 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-h2jj5"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.161188 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.161353 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.161470 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.167306 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"]
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.198013 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") pod \"auto-csr-approver-29560352-h2jj5\" (UID: \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\") " pod="openshift-infra/auto-csr-approver-29560352-h2jj5"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.300020 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") pod \"auto-csr-approver-29560352-h2jj5\" (UID: \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\") " pod="openshift-infra/auto-csr-approver-29560352-h2jj5"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.331560 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") pod \"auto-csr-approver-29560352-h2jj5\" (UID: \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\") " pod="openshift-infra/auto-csr-approver-29560352-h2jj5"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.474862 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-h2jj5"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.523496 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"8820119474c4b4aaeaf00a16224f6b12b6a7323d3aa4cea1c91c92b863c312e2"}
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.523539 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4"}
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.523548 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"7d26c3e5619765fa1e99d6abcec28a53dc87906630a769d555889ac29a33178f"}
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.529573 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"b391a427f5aec737b75d773134851b88c11bb653f27ae28c101fa30d4149a75b"}
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.534119 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"7d1f02eba871e4e5fe005c8076ae7879a4fafc6fb5e7fb9bbde386f0b69b76ac"}
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.534162 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a"}
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.559451 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" podStartSLOduration=5.108382089 podStartE2EDuration="5.559405434s" podCreationTimestamp="2026-03-16 00:31:55 +0000 UTC" firstStartedPulling="2026-03-16 00:31:59.783122909 +0000 UTC m=+1528.383221339" lastFinishedPulling="2026-03-16 00:32:00.234146254 +0000 UTC m=+1528.834244684" observedRunningTime="2026-03-16 00:32:00.555048677 +0000 UTC m=+1529.155147117" watchObservedRunningTime="2026-03-16 00:32:00.559405434 +0000 UTC m=+1529.159503864"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.581632 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" podStartSLOduration=4.109260681 podStartE2EDuration="4.581612366s" podCreationTimestamp="2026-03-16 00:31:56 +0000 UTC" firstStartedPulling="2026-03-16 00:31:59.443492556 +0000 UTC m=+1528.043590986" lastFinishedPulling="2026-03-16 00:31:59.915844241 +0000 UTC m=+1528.515942671" observedRunningTime="2026-03-16 00:32:00.57501768 +0000 UTC m=+1529.175116130" watchObservedRunningTime="2026-03-16 00:32:00.581612366 +0000 UTC m=+1529.181710796"
Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.649411 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=15.303882348 podStartE2EDuration="33.649390115s" podCreationTimestamp="2026-03-16 00:31:27 +0000 UTC" firstStartedPulling="2026-03-16 00:31:41.243651205 +0000 UTC m=+1509.843749636" lastFinishedPulling="2026-03-16 00:31:59.589158973 +0000 UTC m=+1528.189257403" observedRunningTime="2026-03-16 00:32:00.61997671 +0000 UTC m=+1529.220075140" watchObservedRunningTime="2026-03-16 00:32:00.649390115 +0000 UTC m=+1529.249488545"
Mar 16 00:32:01 crc kubenswrapper[4983]: I0316 00:32:01.091743 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"]
Mar 16 00:32:01 crc kubenswrapper[4983]: I0316 00:32:01.542612 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" event={"ID":"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf","Type":"ContainerStarted","Data":"1a24252820d3b1761c9e6b612d0d04c82239e2b635f6875b1ac06e6defe006a0"}
Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.156813 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0"
Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.192893 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0"
Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.574145 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" event={"ID":"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf","Type":"ContainerStarted","Data":"e9ad9d465cd26c47636106767bbd93622a4df6a39436eee8b4a72f1c036d34fd"}
Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.585672 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" podStartSLOduration=1.479191047 podStartE2EDuration="2.585658672s" podCreationTimestamp="2026-03-16 00:32:00 +0000 UTC" firstStartedPulling="2026-03-16 00:32:01.082249725 +0000 UTC m=+1529.682348155" lastFinishedPulling="2026-03-16 00:32:02.18871735 +0000 UTC m=+1530.788815780" observedRunningTime="2026-03-16 00:32:02.583921666 +0000 UTC m=+1531.184020106" watchObservedRunningTime="2026-03-16 00:32:02.585658672 +0000 UTC m=+1531.185757102"
Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.614192 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0"
Mar 16 00:32:03 crc kubenswrapper[4983]: I0316 00:32:03.580478 4983 generic.go:334] "Generic (PLEG): container finished" podID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" containerID="e9ad9d465cd26c47636106767bbd93622a4df6a39436eee8b4a72f1c036d34fd" exitCode=0
Mar 16 00:32:03 crc kubenswrapper[4983]: I0316 00:32:03.580545 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" event={"ID":"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf","Type":"ContainerDied","Data":"e9ad9d465cd26c47636106767bbd93622a4df6a39436eee8b4a72f1c036d34fd"}
Mar 16 00:32:04 crc kubenswrapper[4983]: I0316 00:32:04.884434 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-h2jj5"
Mar 16 00:32:04 crc kubenswrapper[4983]: I0316 00:32:04.974296 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") pod \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\" (UID: \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\") "
Mar 16 00:32:04 crc kubenswrapper[4983]: I0316 00:32:04.981630 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh" (OuterVolumeSpecName: "kube-api-access-c9xfh") pod "69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" (UID: "69be3986-5dfe-49cd-a9c9-8bde7c59eaaf"). InnerVolumeSpecName "kube-api-access-c9xfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.076265 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") on node \"crc\" DevicePath \"\""
Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.178006 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"]
Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.186788 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"]
Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.594663 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" event={"ID":"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf","Type":"ContainerDied","Data":"1a24252820d3b1761c9e6b612d0d04c82239e2b635f6875b1ac06e6defe006a0"}
Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.594701 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-h2jj5"
Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.594710 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a24252820d3b1761c9e6b612d0d04c82239e2b635f6875b1ac06e6defe006a0"
Mar 16 00:32:06 crc kubenswrapper[4983]: I0316 00:32:06.102262 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb0fb90-2fa7-4376-8997-678868e0832a" path="/var/lib/kubelet/pods/3cb0fb90-2fa7-4376-8997-678868e0832a/volumes"
Mar 16 00:32:06 crc kubenswrapper[4983]: I0316 00:32:06.890866 4983 scope.go:117] "RemoveContainer" containerID="2ff36755f0536b7a806a46f7132557ff7e7da3301af8403a26d90d34c8f6b2e8"
Mar 16 00:32:08 crc kubenswrapper[4983]: I0316 00:32:08.735190 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"]
Mar 16 00:32:08 crc kubenswrapper[4983]: I0316 00:32:08.735729 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerName="default-interconnect" containerID="cri-o://9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235" gracePeriod=30
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.086982 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142376 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") "
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142526 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") "
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142576 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") "
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142639 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") "
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142705 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") "
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142746 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") "
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142793 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") "
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.143770 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.147578 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.151957 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.151995 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.158256 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.158282 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv" (OuterVolumeSpecName: "kube-api-access-4wcbv") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "kube-api-access-4wcbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.158381 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245272 4983 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") on node \"crc\" DevicePath \"\""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245299 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") on node \"crc\" DevicePath \"\""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245311 4983 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245320 4983 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245329 4983 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245339 4983 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245348 4983 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\""
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.628389 4983 generic.go:334] "Generic (PLEG): container finished" podID="e4d42cf2-f1fd-4aa7-b950-0d60911c50af" containerID="53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a" exitCode=0
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.628431 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerDied","Data":"53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a"}
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.629328 4983 scope.go:117] "RemoveContainer" containerID="53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630641 4983 generic.go:334] "Generic (PLEG): container finished" podID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerID="9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235" exitCode=0
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630705 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" event={"ID":"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3","Type":"ContainerDied","Data":"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"}
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630737 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" event={"ID":"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3","Type":"ContainerDied","Data":"914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8"}
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630775 4983 scope.go:117] "RemoveContainer" containerID="9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630916 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.636839 4983 generic.go:334] "Generic (PLEG): container finished" podID="f90f8a5e-67de-4058-9e42-0caf957b6b71" containerID="0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1" exitCode=0
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.636936 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerDied","Data":"0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1"}
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.637495 4983 scope.go:117] "RemoveContainer" containerID="0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.654161 4983 generic.go:334] "Generic (PLEG): container finished" podID="c6f00393-4848-47e5-8836-3e3b9c3a5b95" containerID="de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63" exitCode=0
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.654330 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerDied","Data":"de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63"}
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.655082 4983 scope.go:117] "RemoveContainer" containerID="de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.664954 4983 scope.go:117] "RemoveContainer" containerID="9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"
Mar 16 00:32:09 crc kubenswrapper[4983]: E0316 00:32:09.666122 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235\": container with ID starting with 9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235 not found: ID does not exist" containerID="9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.666185 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"} err="failed to get container status \"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235\": rpc error: code = NotFound desc = could not find container \"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235\": container with ID starting with 9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235 not found: ID does not exist"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.688259 4983 generic.go:334] "Generic (PLEG): container finished" podID="1b5da06a-5282-43dc-b876-76eb99ba6f9d" containerID="d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6" exitCode=0
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.688412 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerDied","Data":"d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6"}
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.689282 4983 scope.go:117] "RemoveContainer" containerID="d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.693650 4983 generic.go:334] "Generic (PLEG): container finished" podID="c395c954-b7f5-4ec0-be3d-29c8cec19fb1" containerID="9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4" exitCode=0
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.693691 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerDied","Data":"9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4"}
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.700264 4983 scope.go:117] "RemoveContainer" containerID="9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4"
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.752682 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"]
Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.765643 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"]
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.101941 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" path="/var/lib/kubelet/pods/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3/volumes"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612335 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4bn8h"]
Mar 16 00:32:10 crc kubenswrapper[4983]: E0316 00:32:10.612664 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerName="default-interconnect"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612684 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerName="default-interconnect"
Mar 16 00:32:10 crc kubenswrapper[4983]: E0316 00:32:10.612699 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" containerName="oc"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612707 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" containerName="oc"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612872 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerName="default-interconnect"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612894 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" containerName="oc"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.613408 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.617839 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618014 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618116 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618043 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618356 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618387 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.622085 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-kdvm9"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.624027 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4bn8h"]
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.665892 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666171 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-sasl-users\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666274 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6pw\" (UniqueName: \"kubernetes.io/projected/765c4911-7ef9-4466-9038-70c84c21e731-kube-api-access-ft6pw\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666377 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/765c4911-7ef9-4466-9038-70c84c21e731-sasl-config\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666484 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666518 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666584 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.703009 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620"}
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.708887 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1"}
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.712989 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798"}
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.729319 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a"}
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.732332 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15"}
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.768899 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/765c4911-7ef9-4466-9038-70c84c21e731-sasl-config\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.768979 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769020 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769119 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769203 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769323 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-sasl-users\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769374 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6pw\" (UniqueName: \"kubernetes.io/projected/765c4911-7ef9-4466-9038-70c84c21e731-kube-api-access-ft6pw\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.770154 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/765c4911-7ef9-4466-9038-70c84c21e731-sasl-config\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.778879 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h"
Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.779461 4983 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.788837 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6pw\" (UniqueName: \"kubernetes.io/projected/765c4911-7ef9-4466-9038-70c84c21e731-kube-api-access-ft6pw\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.821617 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.822956 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-sasl-users\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.824651 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.937485 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.478601 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4bn8h"] Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.743595 4983 generic.go:334] "Generic (PLEG): container finished" podID="1b5da06a-5282-43dc-b876-76eb99ba6f9d" containerID="02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.743682 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerDied","Data":"02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.743749 4983 scope.go:117] "RemoveContainer" containerID="d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.744426 4983 scope.go:117] "RemoveContainer" containerID="02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.744689 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82_service-telemetry(1b5da06a-5282-43dc-b876-76eb99ba6f9d)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" podUID="1b5da06a-5282-43dc-b876-76eb99ba6f9d" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.747032 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" event={"ID":"765c4911-7ef9-4466-9038-70c84c21e731","Type":"ContainerStarted","Data":"6b9eaa0bce094cfdcf8bcb97f816da561bc2bdad17e78f44927011bfff8426fe"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.747079 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" event={"ID":"765c4911-7ef9-4466-9038-70c84c21e731","Type":"ContainerStarted","Data":"16b6adc4d9fe3612f3abb3da989439a47527f6a10a92c02041043cc98af2121f"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.752907 4983 generic.go:334] "Generic (PLEG): container finished" podID="c395c954-b7f5-4ec0-be3d-29c8cec19fb1" containerID="9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.753004 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerDied","Data":"9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.754082 4983 scope.go:117] "RemoveContainer" containerID="9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.754388 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-7cd756799d-8ld79_service-telemetry(c395c954-b7f5-4ec0-be3d-29c8cec19fb1)\"" 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" podUID="c395c954-b7f5-4ec0-be3d-29c8cec19fb1" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.759343 4983 generic.go:334] "Generic (PLEG): container finished" podID="e4d42cf2-f1fd-4aa7-b950-0d60911c50af" containerID="4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.759424 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerDied","Data":"4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.760098 4983 scope.go:117] "RemoveContainer" containerID="4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.760410 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v_service-telemetry(e4d42cf2-f1fd-4aa7-b950-0d60911c50af)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" podUID="e4d42cf2-f1fd-4aa7-b950-0d60911c50af" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.773195 4983 scope.go:117] "RemoveContainer" containerID="9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.775578 4983 generic.go:334] "Generic (PLEG): container finished" podID="f90f8a5e-67de-4058-9e42-0caf957b6b71" containerID="12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.775681 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerDied","Data":"12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.776519 4983 scope.go:117] "RemoveContainer" containerID="12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.776804 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9_service-telemetry(f90f8a5e-67de-4058-9e42-0caf957b6b71)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" podUID="f90f8a5e-67de-4058-9e42-0caf957b6b71" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.799503 4983 generic.go:334] "Generic (PLEG): container finished" podID="c6f00393-4848-47e5-8836-3e3b9c3a5b95" containerID="5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.799576 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerDied","Data":"5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.802582 4983 scope.go:117] "RemoveContainer" 
containerID="5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.805526 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf_service-telemetry(c6f00393-4848-47e5-8836-3e3b9c3a5b95)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" podUID="c6f00393-4848-47e5-8836-3e3b9c3a5b95" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.829025 4983 scope.go:117] "RemoveContainer" containerID="53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.847006 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" podStartSLOduration=3.8469824299999997 podStartE2EDuration="3.84698243s" podCreationTimestamp="2026-03-16 00:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:32:11.839015387 +0000 UTC m=+1540.439113827" watchObservedRunningTime="2026-03-16 00:32:11.84698243 +0000 UTC m=+1540.447080860" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.866670 4983 scope.go:117] "RemoveContainer" containerID="0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.953934 4983 scope.go:117] "RemoveContainer" containerID="de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.267709 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.269101 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.271146 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.271842 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.280495 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.314279 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rhhn\" (UniqueName: \"kubernetes.io/projected/3a6a664b-b239-4fa8-b423-19c1246c89cd-kube-api-access-6rhhn\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.314383 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/3a6a664b-b239-4fa8-b423-19c1246c89cd-qdr-test-config\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.314443 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/3a6a664b-b239-4fa8-b423-19c1246c89cd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.416505 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rhhn\" (UniqueName: \"kubernetes.io/projected/3a6a664b-b239-4fa8-b423-19c1246c89cd-kube-api-access-6rhhn\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.416629 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/3a6a664b-b239-4fa8-b423-19c1246c89cd-qdr-test-config\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.416670 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/3a6a664b-b239-4fa8-b423-19c1246c89cd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.417665 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/3a6a664b-b239-4fa8-b423-19c1246c89cd-qdr-test-config\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.426701 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/3a6a664b-b239-4fa8-b423-19c1246c89cd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: 
\"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.433393 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rhhn\" (UniqueName: \"kubernetes.io/projected/3a6a664b-b239-4fa8-b423-19c1246c89cd-kube-api-access-6rhhn\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.596244 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 16 00:32:13 crc kubenswrapper[4983]: I0316 00:32:13.080830 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:32:13 crc kubenswrapper[4983]: W0316 00:32:13.085977 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a6a664b_b239_4fa8_b423_19c1246c89cd.slice/crio-71d44bd285b36d95af3b9fbcbff1be387784c9b12b8545177545ffd2cc53b717 WatchSource:0}: Error finding container 71d44bd285b36d95af3b9fbcbff1be387784c9b12b8545177545ffd2cc53b717: Status 404 returned error can't find the container with id 71d44bd285b36d95af3b9fbcbff1be387784c9b12b8545177545ffd2cc53b717 Mar 16 00:32:13 crc kubenswrapper[4983]: I0316 00:32:13.849452 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"3a6a664b-b239-4fa8-b423-19c1246c89cd","Type":"ContainerStarted","Data":"71d44bd285b36d95af3b9fbcbff1be387784c9b12b8545177545ffd2cc53b717"} Mar 16 00:32:23 crc kubenswrapper[4983]: I0316 00:32:23.448804 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:32:23 crc kubenswrapper[4983]: I0316 00:32:23.450173 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:32:23 crc kubenswrapper[4983]: I0316 00:32:23.933673 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"3a6a664b-b239-4fa8-b423-19c1246c89cd","Type":"ContainerStarted","Data":"911b626681d37633e50876be3127afad71c35e402072c125e8e7e4db145d326a"} Mar 16 00:32:23 crc kubenswrapper[4983]: I0316 00:32:23.949680 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=1.6262936190000001 podStartE2EDuration="11.949657746s" podCreationTimestamp="2026-03-16 00:32:12 +0000 UTC" firstStartedPulling="2026-03-16 00:32:13.089818464 +0000 UTC m=+1541.689916894" lastFinishedPulling="2026-03-16 00:32:23.413182591 +0000 UTC m=+1552.013281021" observedRunningTime="2026-03-16 00:32:23.946457021 +0000 UTC m=+1552.546555461" watchObservedRunningTime="2026-03-16 00:32:23.949657746 +0000 UTC m=+1552.549756196" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.093165 4983 scope.go:117] "RemoveContainer" containerID="4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.281387 4983 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-6kz94"] Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.282407 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.285219 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.285520 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.285619 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.286243 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.286294 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.286419 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.303640 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-6kz94"] Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400617 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400689 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400712 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400743 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400784 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: 
\"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400809 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400828 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.502781 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.502878 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.502946 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.502976 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.503023 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.503056 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 
00:32:24.503093 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.504785 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.505032 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.505492 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.505721 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.506280 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.506611 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.531448 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.614511 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.691658 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.692829 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.701815 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.806594 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") pod \"curl\" (UID: \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\") " pod="service-telemetry/curl" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.907578 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") pod \"curl\" (UID: \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\") " pod="service-telemetry/curl" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.927296 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") pod \"curl\" (UID: \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\") " pod="service-telemetry/curl" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.943795 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"f9375a05af6f467146d3013de939726abd7db3ac804b12efd272b5b6653ea1ff"} Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.008551 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.098156 4983 scope.go:117] "RemoveContainer" containerID="02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a" Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.154150 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-6kz94"] Mar 16 00:32:25 crc kubenswrapper[4983]: W0316 00:32:25.164787 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd507b81c_ccea_4bf1_9f0c_55266c51bc27.slice/crio-a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922 WatchSource:0}: Error finding container a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922: Status 404 returned error can't find the container with id a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922 Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.285717 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.952404 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerStarted","Data":"a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922"} Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.954702 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"55708cf206dfc8b11c0836b329aadaa86340d09904619d2b4794423920841dc7"} Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.957358 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85","Type":"ContainerStarted","Data":"abb62b80bf96acf3ca0f52f2c57f7d56c28aacf8192a83c8a83908a8b55945c5"} Mar 16 00:32:26 crc kubenswrapper[4983]: I0316 00:32:26.093416 4983 scope.go:117] "RemoveContainer" containerID="5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798" Mar 16 00:32:26 crc kubenswrapper[4983]: I0316 00:32:26.094093 4983 scope.go:117] "RemoveContainer" containerID="12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1" Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.092682 4983 scope.go:117] "RemoveContainer" containerID="9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15" Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.976824 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"aeabd7899ca69f7686c59db0d40cdd077c528d5939bade6b5b522978c1c199c2"} Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.977992 4983 generic.go:334] "Generic (PLEG): container finished" podID="d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" containerID="b6013509887d2e2f4faebc882cfa5c7abb1f33f5f915bce9e2025431e12a7234" exitCode=0 Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.978031 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85","Type":"ContainerDied","Data":"b6013509887d2e2f4faebc882cfa5c7abb1f33f5f915bce9e2025431e12a7234"} Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 
00:32:27.981030 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"511bee0fe29e1cdb6f01989df7c4f201112407b0103fda5b1d647405bbaf9609"} Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.983030 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"df18d3eedeadf2e925899f3ccaaac110cd7077a13a6d2d24c7a34e0f78e95d16"} Mar 16 00:32:31 crc kubenswrapper[4983]: I0316 00:32:31.774285 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:32:31 crc kubenswrapper[4983]: I0316 00:32:31.917129 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") pod \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\" (UID: \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\") " Mar 16 00:32:31 crc kubenswrapper[4983]: I0316 00:32:31.922417 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx" (OuterVolumeSpecName: "kube-api-access-5wjfx") pod "d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" (UID: "d68c4bdc-1f0d-47dc-ada9-07ec54da7e85"). InnerVolumeSpecName "kube-api-access-5wjfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:31 crc kubenswrapper[4983]: I0316 00:32:31.966134 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_d68c4bdc-1f0d-47dc-ada9-07ec54da7e85/curl/0.log" Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.011559 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85","Type":"ContainerDied","Data":"abb62b80bf96acf3ca0f52f2c57f7d56c28aacf8192a83c8a83908a8b55945c5"} Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.011601 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abb62b80bf96acf3ca0f52f2c57f7d56c28aacf8192a83c8a83908a8b55945c5" Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.011662 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.019038 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.249039 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4h8qf_1f2a95c7-282e-4200-ac63-1a114726205b/prometheus-webhook-snmp/0.log" Mar 16 00:32:37 crc kubenswrapper[4983]: I0316 00:32:37.048390 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerStarted","Data":"7ba03515b7feec00fc3cc152ded030cc67b02f774e8b7318ac17702be069987f"} Mar 16 00:32:42 crc kubenswrapper[4983]: I0316 00:32:42.111850 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerStarted","Data":"89d73a24a08dcc769de73df504ef03e9eb6f0444dcaf2637db15f3aa6ddfe562"} Mar 16 00:32:42 crc kubenswrapper[4983]: I0316 00:32:42.128734 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-6kz94" podStartSLOduration=1.425926246 podStartE2EDuration="18.128720032s" podCreationTimestamp="2026-03-16 00:32:24 +0000 UTC" firstStartedPulling="2026-03-16 00:32:25.169272219 +0000 UTC m=+1553.769370649" lastFinishedPulling="2026-03-16 00:32:41.872066015 +0000 UTC m=+1570.472164435" observedRunningTime="2026-03-16 00:32:42.128056895 +0000 UTC m=+1570.728155325" watchObservedRunningTime="2026-03-16 00:32:42.128720032 +0000 UTC m=+1570.728818462" Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.448475 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.449079 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.449132 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.449785 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.449854 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" 
containerID="cri-o://10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" gracePeriod=600 Mar 16 00:32:54 crc kubenswrapper[4983]: E0316 00:32:54.152131 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:32:54 crc kubenswrapper[4983]: I0316 00:32:54.186985 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" exitCode=0 Mar 16 00:32:54 crc kubenswrapper[4983]: I0316 00:32:54.187298 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"} Mar 16 00:32:54 crc kubenswrapper[4983]: I0316 00:32:54.187441 4983 scope.go:117] "RemoveContainer" containerID="5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350" Mar 16 00:32:54 crc kubenswrapper[4983]: I0316 00:32:54.187713 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:32:54 crc kubenswrapper[4983]: E0316 00:32:54.188007 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:33:02 crc kubenswrapper[4983]: I0316 00:33:02.387460 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4h8qf_1f2a95c7-282e-4200-ac63-1a114726205b/prometheus-webhook-snmp/0.log" Mar 16 00:33:08 crc kubenswrapper[4983]: I0316 00:33:08.093124 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:33:08 crc kubenswrapper[4983]: E0316 00:33:08.093979 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:33:10 crc kubenswrapper[4983]: I0316 00:33:10.316746 4983 generic.go:334] "Generic (PLEG): container finished" podID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerID="7ba03515b7feec00fc3cc152ded030cc67b02f774e8b7318ac17702be069987f" exitCode=0 Mar 16 00:33:10 crc kubenswrapper[4983]: I0316 00:33:10.316807 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerDied","Data":"7ba03515b7feec00fc3cc152ded030cc67b02f774e8b7318ac17702be069987f"} Mar 16 00:33:10 crc 
kubenswrapper[4983]: I0316 00:33:10.317377 4983 scope.go:117] "RemoveContainer" containerID="7ba03515b7feec00fc3cc152ded030cc67b02f774e8b7318ac17702be069987f" Mar 16 00:33:14 crc kubenswrapper[4983]: I0316 00:33:14.358163 4983 generic.go:334] "Generic (PLEG): container finished" podID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerID="89d73a24a08dcc769de73df504ef03e9eb6f0444dcaf2637db15f3aa6ddfe562" exitCode=0 Mar 16 00:33:14 crc kubenswrapper[4983]: I0316 00:33:14.358241 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerDied","Data":"89d73a24a08dcc769de73df504ef03e9eb6f0444dcaf2637db15f3aa6ddfe562"} Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.616219 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.716536 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.716670 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717349 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717377 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717397 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717420 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717518 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 
00:33:15.722112 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b" (OuterVolumeSpecName: "kube-api-access-zbv9b") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "kube-api-access-zbv9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.733523 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.734934 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.735185 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.736204 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.738234 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.746335 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "ceilometer-entrypoint-script". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819301 4983 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819326 4983 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819334 4983 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819342 4983 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819351 4983 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819359 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819367 4983 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:16 crc kubenswrapper[4983]: I0316 00:33:16.375085 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerDied","Data":"a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922"} Mar 16 00:33:16 crc kubenswrapper[4983]: I0316 00:33:16.375478 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922" Mar 16 00:33:16 crc kubenswrapper[4983]: I0316 00:33:16.375182 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:33:17 crc kubenswrapper[4983]: I0316 00:33:17.586790 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-6kz94_d507b81c-ccea-4bf1-9f0c-55266c51bc27/smoketest-collectd/0.log" Mar 16 00:33:17 crc kubenswrapper[4983]: I0316 00:33:17.830858 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-6kz94_d507b81c-ccea-4bf1-9f0c-55266c51bc27/smoketest-ceilometer/0.log" Mar 16 00:33:18 crc kubenswrapper[4983]: I0316 00:33:18.098868 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-4bn8h_765c4911-7ef9-4466-9038-70c84c21e731/default-interconnect/0.log" Mar 16 00:33:18 crc kubenswrapper[4983]: I0316 00:33:18.350507 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82_1b5da06a-5282-43dc-b876-76eb99ba6f9d/bridge/2.log" Mar 16 00:33:18 crc kubenswrapper[4983]: I0316 00:33:18.599360 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82_1b5da06a-5282-43dc-b876-76eb99ba6f9d/sg-core/0.log" Mar 16 00:33:18 crc kubenswrapper[4983]: I0316 00:33:18.862950 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7cd756799d-8ld79_c395c954-b7f5-4ec0-be3d-29c8cec19fb1/bridge/2.log" Mar 16 00:33:19 crc kubenswrapper[4983]: I0316 00:33:19.119554 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7cd756799d-8ld79_c395c954-b7f5-4ec0-be3d-29c8cec19fb1/sg-core/0.log" Mar 16 00:33:19 crc kubenswrapper[4983]: I0316 00:33:19.402987 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf_c6f00393-4848-47e5-8836-3e3b9c3a5b95/bridge/2.log" Mar 16 00:33:19 crc kubenswrapper[4983]: I0316 00:33:19.668629 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf_c6f00393-4848-47e5-8836-3e3b9c3a5b95/sg-core/0.log" Mar 16 00:33:19 crc kubenswrapper[4983]: I0316 00:33:19.955196 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v_e4d42cf2-f1fd-4aa7-b950-0d60911c50af/bridge/2.log" Mar 16 00:33:20 crc kubenswrapper[4983]: I0316 00:33:20.211089 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v_e4d42cf2-f1fd-4aa7-b950-0d60911c50af/sg-core/0.log" Mar 16 00:33:20 crc kubenswrapper[4983]: I0316 00:33:20.465146 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9_f90f8a5e-67de-4058-9e42-0caf957b6b71/bridge/2.log" Mar 16 00:33:20 crc kubenswrapper[4983]: I0316 00:33:20.695395 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9_f90f8a5e-67de-4058-9e42-0caf957b6b71/sg-core/0.log" Mar 16 00:33:23 crc kubenswrapper[4983]: I0316 00:33:23.092387 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:33:23 crc kubenswrapper[4983]: E0316 
00:33:23.092950 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:33:24 crc kubenswrapper[4983]: I0316 00:33:24.264501 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-64877956d4-ljbdp_5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d/operator/0.log" Mar 16 00:33:24 crc kubenswrapper[4983]: I0316 00:33:24.549370 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_4c952d0a-6462-4081-8603-935847aefe14/prometheus/0.log" Mar 16 00:33:24 crc kubenswrapper[4983]: I0316 00:33:24.816551 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e/elasticsearch/0.log" Mar 16 00:33:25 crc kubenswrapper[4983]: I0316 00:33:25.104105 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4h8qf_1f2a95c7-282e-4200-ac63-1a114726205b/prometheus-webhook-snmp/0.log" Mar 16 00:33:25 crc kubenswrapper[4983]: I0316 00:33:25.360812 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_f21bd8c3-505c-465a-afeb-404a9136ea58/alertmanager/0.log" Mar 16 00:33:38 crc kubenswrapper[4983]: I0316 00:33:38.092645 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:33:38 crc kubenswrapper[4983]: E0316 00:33:38.093342 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:33:39 crc kubenswrapper[4983]: I0316 00:33:39.284006 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-65fdb44596-qnp9k_b8f4edcf-0403-4d59-b045-e618c6aabff5/operator/0.log" Mar 16 00:33:42 crc kubenswrapper[4983]: I0316 00:33:42.521900 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-64877956d4-ljbdp_5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d/operator/0.log" Mar 16 00:33:42 crc kubenswrapper[4983]: I0316 00:33:42.799127 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_3a6a664b-b239-4fa8-b423-19c1246c89cd/qdr/0.log" Mar 16 00:33:52 crc kubenswrapper[4983]: I0316 00:33:52.100540 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:33:52 crc kubenswrapper[4983]: E0316 00:33:52.101420 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.159007 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560354-x6s7n"] Mar 16 00:34:00 crc kubenswrapper[4983]: E0316 00:34:00.159895 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-collectd" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.159912 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-collectd" Mar 16 00:34:00 crc kubenswrapper[4983]: E0316 00:34:00.159926 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" containerName="curl" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.159945 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" containerName="curl" Mar 16 00:34:00 crc kubenswrapper[4983]: E0316 00:34:00.159958 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-ceilometer" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.159967 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-ceilometer" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.162360 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-ceilometer" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.162482 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" containerName="curl" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.162583 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-collectd" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.163305 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.166314 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.166165 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.166836 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-x6s7n"] Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.166893 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.190528 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") pod \"auto-csr-approver-29560354-x6s7n\" (UID: \"34558a42-5623-4304-982a-7eb50d175b2d\") " pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.291810 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") pod \"auto-csr-approver-29560354-x6s7n\" (UID: \"34558a42-5623-4304-982a-7eb50d175b2d\") " pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.312544 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") pod \"auto-csr-approver-29560354-x6s7n\" (UID: \"34558a42-5623-4304-982a-7eb50d175b2d\") " pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.488595 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.881327 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-x6s7n"] Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.895402 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:34:01 crc kubenswrapper[4983]: I0316 00:34:01.735721 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" event={"ID":"34558a42-5623-4304-982a-7eb50d175b2d","Type":"ContainerStarted","Data":"b96276156b81bf2f529428d5a801e008a094c9f73f65bbc568358821bf0d4c60"} Mar 16 00:34:02 crc kubenswrapper[4983]: I0316 00:34:02.745312 4983 generic.go:334] "Generic (PLEG): container finished" podID="34558a42-5623-4304-982a-7eb50d175b2d" containerID="7cbb072030f4c1287101d25e5a82e3b25ae4dbd2cfe9ee92a6c85b6bbd3dd66e" exitCode=0 Mar 16 00:34:02 crc kubenswrapper[4983]: I0316 00:34:02.745501 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" event={"ID":"34558a42-5623-4304-982a-7eb50d175b2d","Type":"ContainerDied","Data":"7cbb072030f4c1287101d25e5a82e3b25ae4dbd2cfe9ee92a6c85b6bbd3dd66e"} Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.015562 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.092382 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:04 crc kubenswrapper[4983]: E0316 00:34:04.092692 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.174718 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") pod \"34558a42-5623-4304-982a-7eb50d175b2d\" (UID: \"34558a42-5623-4304-982a-7eb50d175b2d\") " Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.179780 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n" (OuterVolumeSpecName: "kube-api-access-m2d7n") pod "34558a42-5623-4304-982a-7eb50d175b2d" (UID: "34558a42-5623-4304-982a-7eb50d175b2d"). InnerVolumeSpecName "kube-api-access-m2d7n". 
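
The machine-config-daemon entries recurring through this capture (00:33:23, 00:33:38, 00:33:52, 00:34:04, ...) are one pattern, not many incidents: each pod sync re-evaluates dead container 10e0124b…, the pod worker rejects the restart because the 5m0s CrashLoopBackOff window has not expired, and the kubelet logs the same I-level "RemoveContainer" plus E-level "Error syncing pod, skipping" pair again. The 10 to 15 second spacing is the sync-loop retry cadence, not the back-off itself; no new start attempt happens until the back-off expires, and the same dead container ID recurs throughout this window. A sketch that makes the cadence visible, assuming one journal entry per line and the pod name taken from the entries above:

```python
# Print the interval between successive CrashLoopBackOff entries for
# machine-config-daemon-7sbnj (name copied from this log, not an API).
import re
import sys
from datetime import datetime

pat = re.compile(
    r'(\w{3} +\d+ \d{2}:\d{2}:\d{2}).*?CrashLoopBackOff.*?machine-config-daemon-7sbnj')

times = []
for line in sys.stdin:
    if m := pat.search(line):
        times.append(datetime.strptime(m.group(1), "%b %d %H:%M:%S"))

for prev, cur in zip(times, times[1:]):
    print(f"{cur.time()}  +{(cur - prev).total_seconds():.0f}s since previous")
```
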
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.276360 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.764467 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" event={"ID":"34558a42-5623-4304-982a-7eb50d175b2d","Type":"ContainerDied","Data":"b96276156b81bf2f529428d5a801e008a094c9f73f65bbc568358821bf0d4c60"} Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.764794 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b96276156b81bf2f529428d5a801e008a094c9f73f65bbc568358821bf0d4c60" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.764513 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:05 crc kubenswrapper[4983]: I0316 00:34:05.077639 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:34:05 crc kubenswrapper[4983]: I0316 00:34:05.084733 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:34:06 crc kubenswrapper[4983]: I0316 00:34:06.104441 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9b49cf-d10f-4047-aa75-b89a01652d64" path="/var/lib/kubelet/pods/ee9b49cf-d10f-4047-aa75-b89a01652d64/volumes" Mar 16 00:34:06 crc kubenswrapper[4983]: I0316 00:34:06.982548 4983 scope.go:117] "RemoveContainer" containerID="e7a4f48409b5b7c54ae90135f67eb66bb8944343dc341edc9f4aefa5c3b5777f" Mar 16 00:34:07 crc kubenswrapper[4983]: I0316 00:34:07.354138 4983 scope.go:117] "RemoveContainer" containerID="78e4a4ba6d978cbff10ef4b981fae6b3ca9ed9e86332628a7ba7f28e90fd7ea7" Mar 16 00:34:07 crc kubenswrapper[4983]: I0316 00:34:07.393443 4983 scope.go:117] "RemoveContainer" containerID="591b456456f261ac0f2b9814ab85dfd85135db5fb073dd674dfeacb880c9917a" Mar 16 00:34:16 crc kubenswrapper[4983]: I0316 00:34:16.093921 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:16 crc kubenswrapper[4983]: E0316 00:34:16.095364 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.364956 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:34:17 crc kubenswrapper[4983]: E0316 00:34:17.366269 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34558a42-5623-4304-982a-7eb50d175b2d" containerName="oc" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.366426 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="34558a42-5623-4304-982a-7eb50d175b2d" containerName="oc" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.366749 4983 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="34558a42-5623-4304-982a-7eb50d175b2d" containerName="oc" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.372009 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.376561 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ldkzc"/"default-dockercfg-75pn8" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.376847 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ldkzc"/"openshift-service-ca.crt" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.377130 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ldkzc"/"kube-root-ca.crt" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.394475 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.462791 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.462871 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.564613 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.564695 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.565118 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.593373 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.691634 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:18 crc kubenswrapper[4983]: I0316 00:34:18.136829 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:34:18 crc kubenswrapper[4983]: I0316 00:34:18.884371 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldkzc/must-gather-dnctp" event={"ID":"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736","Type":"ContainerStarted","Data":"30df828c4c51d9f6aeee032cd0b5e15fea4fb9fd1e7c5083f2b3af2a39f13cc5"} Mar 16 00:34:24 crc kubenswrapper[4983]: I0316 00:34:24.942530 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldkzc/must-gather-dnctp" event={"ID":"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736","Type":"ContainerStarted","Data":"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2"} Mar 16 00:34:24 crc kubenswrapper[4983]: I0316 00:34:24.943018 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldkzc/must-gather-dnctp" event={"ID":"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736","Type":"ContainerStarted","Data":"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b"} Mar 16 00:34:24 crc kubenswrapper[4983]: I0316 00:34:24.961558 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ldkzc/must-gather-dnctp" podStartSLOduration=1.616939627 podStartE2EDuration="7.961540845s" podCreationTimestamp="2026-03-16 00:34:17 +0000 UTC" firstStartedPulling="2026-03-16 00:34:18.148472876 +0000 UTC m=+1666.748571306" lastFinishedPulling="2026-03-16 00:34:24.493074094 +0000 UTC m=+1673.093172524" observedRunningTime="2026-03-16 00:34:24.953968433 +0000 UTC m=+1673.554066863" watchObservedRunningTime="2026-03-16 00:34:24.961540845 +0000 UTC m=+1673.561639265" Mar 16 00:34:30 crc kubenswrapper[4983]: I0316 00:34:30.092204 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:30 crc kubenswrapper[4983]: E0316 00:34:30.092733 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:43 crc kubenswrapper[4983]: I0316 00:34:43.092916 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:43 crc kubenswrapper[4983]: E0316 00:34:43.093614 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:54 crc kubenswrapper[4983]: I0316 00:34:54.092309 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:54 crc kubenswrapper[4983]: E0316 00:34:54.093104 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:07 crc kubenswrapper[4983]: I0316 00:35:07.093371 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:35:07 crc kubenswrapper[4983]: E0316 00:35:07.094096 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:07 crc kubenswrapper[4983]: I0316 00:35:07.459089 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5pqgr_bb33b891-4cdb-4fc1-95e4-2895f40fdb7a/control-plane-machine-set-operator/0.log" Mar 16 00:35:07 crc kubenswrapper[4983]: I0316 00:35:07.607483 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t4lj8_6b0e4e23-a158-4597-b005-db088a652ec8/kube-rbac-proxy/0.log" Mar 16 00:35:07 crc kubenswrapper[4983]: I0316 00:35:07.639489 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t4lj8_6b0e4e23-a158-4597-b005-db088a652ec8/machine-api-operator/0.log" Mar 16 00:35:18 crc kubenswrapper[4983]: I0316 00:35:18.992505 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-cwdn9_95208db3-d53d-43c0-9b2c-cc4c5b3236d8/cert-manager-controller/0.log" Mar 16 00:35:19 crc kubenswrapper[4983]: I0316 00:35:19.124714 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-g9j58_4c66c255-b5f4-4c72-8902-7225df93821d/cert-manager-cainjector/0.log" Mar 16 00:35:19 crc kubenswrapper[4983]: I0316 00:35:19.179677 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-jbjkj_1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef/cert-manager-webhook/0.log" Mar 16 00:35:22 crc kubenswrapper[4983]: I0316 00:35:22.096257 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:35:22 crc kubenswrapper[4983]: E0316 00:35:22.097021 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:32 crc kubenswrapper[4983]: I0316 00:35:32.988611 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-sn6x8_2af5ec54-bcc4-45f5-839a-135da91513a2/prometheus-operator/0.log" Mar 16 00:35:33 crc kubenswrapper[4983]: I0316 00:35:33.092609 4983 log.go:25] "Finished parsing log file" 
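
The "Observed pod startup duration" entry above (00:34:24, must-gather-dnctp) is internally consistent and shows how the kubelet separates image-pull time from its startup SLO figure: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp (00:34:24.961540845 - 00:34:17 = 7.961540845 s), and podStartSLOduration subtracts the pull window lastFinishedPulling - firstStartedPulling (00:34:24.493074094 - 00:34:18.148472876 = 6.344601218 s), leaving 7.961540845 - 6.344601218 = 1.616939627 s, exactly the logged value. A quick check of that arithmetic (timestamps truncated to microseconds, which is all datetime keeps):

```python
# Verify the podStartSLOduration arithmetic from the entry above.
from datetime import datetime

fmt = "%H:%M:%S.%f"
watch_observed = datetime.strptime("00:34:24.961540", fmt)
created        = datetime.strptime("00:34:17.000000", fmt)
first_pull     = datetime.strptime("00:34:18.148472", fmt)
last_pull      = datetime.strptime("00:34:24.493074", fmt)

e2e  = (watch_observed - created).total_seconds()  # 7.961540 -> podStartE2EDuration
pull = (last_pull - first_pull).total_seconds()    # 6.344602 -> image pull window
print(f"SLO = {e2e - pull:.6f}s")                  # 1.616938 (logged: 1.616939627)
```
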
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c86566d45-kx26j_30d188b9-ab98-47a3-8143-3f58ae611dd6/prometheus-operator-admission-webhook/0.log" Mar 16 00:35:33 crc kubenswrapper[4983]: I0316 00:35:33.223053 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd_7e065fa9-405e-452b-bfe7-c4920a8577db/prometheus-operator-admission-webhook/0.log" Mar 16 00:35:33 crc kubenswrapper[4983]: I0316 00:35:33.284169 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-c99mb_05523d68-53d9-4cc5-a02b-5221a2396606/operator/0.log" Mar 16 00:35:33 crc kubenswrapper[4983]: I0316 00:35:33.428786 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dmdpt_8eb6b056-16ea-46db-b8ea-fd17a717a8e4/perses-operator/0.log" Mar 16 00:35:35 crc kubenswrapper[4983]: I0316 00:35:35.092788 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:35:35 crc kubenswrapper[4983]: E0316 00:35:35.093186 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.308362 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/util/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.460504 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/util/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.478041 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/pull/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.502872 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/pull/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.614950 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/pull/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.626172 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/util/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.700810 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/extract/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.793680 4983 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/util/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.994563 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/pull/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.994616 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.005055 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.140358 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/extract/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.153211 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.154088 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.322019 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.466393 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.500022 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.531998 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.664699 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.689013 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/extract/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.717098 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.837420 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/util/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.029383 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/pull/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.042831 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/util/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.047837 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/pull/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.092744 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:35:48 crc kubenswrapper[4983]: E0316 00:35:48.092992 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.202744 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/util/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.241414 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/pull/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.243601 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/extract/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.399669 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-utilities/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.529897 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-utilities/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.560823 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-content/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.574809 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-content/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.713834 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-utilities/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.723252 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-content/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.945332 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-utilities/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.955842 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/registry-server/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.066454 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.082515 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.132996 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.314012 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.320056 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.505276 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-pvjtd_46c9f8c6-7d08-47e7-866d-7f359e8683be/marketplace-operator/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.558386 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/registry-server/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.633549 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.776239 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.795464 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.824366 4983 log.go:25] "Finished parsing log file" 
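
The burst of log.go:25 "Finished parsing log file" entries from 00:35:07 onward is the kubelet's log handler reading files under /var/log/pods, most plausibly on behalf of the must-gather gather container running at this time (00:34:24 to 00:36:53); each entry appears to mark one completed client read, which is why identical paths recur, and init containers such as extract-utilities, extract-content, util, and pull show up alongside the main containers. A rough tally sketch, assuming one journal entry per line:

```python
# Count parsed pod log files per namespace from the entries above.
import re
import sys
from collections import Counter

path_re = re.compile(r'Finished parsing log file" path="(/var/log/pods/([^_]+)_[^"]+)"')

by_ns = Counter()
for line in sys.stdin:
    if m := path_re.search(line):
        by_ns[m.group(2)] += 1  # namespace is the segment before the first "_"

for ns, n in by_ns.most_common():
    print(f"{n:4d}  {ns}")
```
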
path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.946809 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.954248 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-content/0.log" Mar 16 00:35:50 crc kubenswrapper[4983]: I0316 00:35:50.154319 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/registry-server/0.log" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.092451 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:00 crc kubenswrapper[4983]: E0316 00:36:00.093308 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.155463 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560356-67f7r"] Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.156462 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.158936 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.159131 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.160277 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.160400 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-67f7r"] Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.202729 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") pod \"auto-csr-approver-29560356-67f7r\" (UID: \"e68b39b9-3d4f-4927-86de-6ee746b6165c\") " pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.304118 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") pod \"auto-csr-approver-29560356-67f7r\" (UID: \"e68b39b9-3d4f-4927-86de-6ee746b6165c\") " pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.322208 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") pod \"auto-csr-approver-29560356-67f7r\" (UID: \"e68b39b9-3d4f-4927-86de-6ee746b6165c\") " pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.471318 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.937301 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-67f7r"] Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.079652 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-sn6x8_2af5ec54-bcc4-45f5-839a-135da91513a2/prometheus-operator/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.125854 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd_7e065fa9-405e-452b-bfe7-c4920a8577db/prometheus-operator-admission-webhook/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.149012 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c86566d45-kx26j_30d188b9-ab98-47a3-8143-3f58ae611dd6/prometheus-operator-admission-webhook/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.257375 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-c99mb_05523d68-53d9-4cc5-a02b-5221a2396606/operator/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.258007 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dmdpt_8eb6b056-16ea-46db-b8ea-fd17a717a8e4/perses-operator/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.625202 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-67f7r" event={"ID":"e68b39b9-3d4f-4927-86de-6ee746b6165c","Type":"ContainerStarted","Data":"68cb14a2ffacefe11dbdef620c798b43906d8e7b2d884334faa869500824a5a5"} Mar 16 00:36:02 crc kubenswrapper[4983]: I0316 00:36:02.632468 4983 generic.go:334] "Generic (PLEG): container finished" podID="e68b39b9-3d4f-4927-86de-6ee746b6165c" containerID="577063654d19910ce0b39d5b8b0ea00c99859a562caea958335aa79b85b42df2" exitCode=0 Mar 16 00:36:02 crc kubenswrapper[4983]: I0316 00:36:02.632522 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-67f7r" event={"ID":"e68b39b9-3d4f-4927-86de-6ee746b6165c","Type":"ContainerDied","Data":"577063654d19910ce0b39d5b8b0ea00c99859a562caea958335aa79b85b42df2"} Mar 16 00:36:03 crc kubenswrapper[4983]: I0316 00:36:03.880822 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:03 crc kubenswrapper[4983]: I0316 00:36:03.957715 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") pod \"e68b39b9-3d4f-4927-86de-6ee746b6165c\" (UID: \"e68b39b9-3d4f-4927-86de-6ee746b6165c\") " Mar 16 00:36:03 crc kubenswrapper[4983]: I0316 00:36:03.971948 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z" (OuterVolumeSpecName: "kube-api-access-lg26z") pod "e68b39b9-3d4f-4927-86de-6ee746b6165c" (UID: "e68b39b9-3d4f-4927-86de-6ee746b6165c"). InnerVolumeSpecName "kube-api-access-lg26z". 
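
The auto-csr-approver entries above trace a complete CronJob pod lifecycle in PLEG events: sandbox 68cb14a2… starts (ContainerStarted, 00:36:01), the job's single container 57706365… exits 0 (ContainerDied, 00:36:02), volumes are torn down, the sandbox itself is reported dead at 00:36:04, and the pod from an earlier run (auto-csr-approver-29560350-spjzd) is deleted. A sketch that rebuilds such per-pod timelines, assuming one journal entry per line and the field layout shown in this log:

```python
# Group "SyncLoop (PLEG)" events by pod to reconstruct container timelines.
import re
import sys
from collections import defaultdict

event = re.compile(
    r'(\w{3} +\d+ \d{2}:\d{2}:\d{2}).*?SyncLoop \(PLEG\): event for pod" '
    r'pod="([^"]+)" event=\{"ID":"[^"]+","Type":"(\w+)","Data":"(\w+)"\}')

timelines = defaultdict(list)
for line in sys.stdin:
    if m := event.search(line):
        ts, pod, etype, cid = m.groups()
        timelines[pod].append((ts, etype, cid[:12]))

for pod, events in sorted(timelines.items()):
    print(pod)
    for ts, etype, cid in events:
        print(f"  {ts}  {etype:<16} {cid}")
```

A ContainerDied event whose Data field is the sandbox ID rather than a container ID (as at 00:36:04 here) marks the pod sandbox being reaped, which is why it is immediately followed by "Container not found in pod's containers".
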
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.059730 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.646802 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-67f7r" event={"ID":"e68b39b9-3d4f-4927-86de-6ee746b6165c","Type":"ContainerDied","Data":"68cb14a2ffacefe11dbdef620c798b43906d8e7b2d884334faa869500824a5a5"} Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.646839 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68cb14a2ffacefe11dbdef620c798b43906d8e7b2d884334faa869500824a5a5" Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.646890 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.957742 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.966943 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:36:06 crc kubenswrapper[4983]: I0316 00:36:06.101532 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57c14e51-5c0b-467c-ba79-ac6f39239445" path="/var/lib/kubelet/pods/57c14e51-5c0b-467c-ba79-ac6f39239445/volumes" Mar 16 00:36:07 crc kubenswrapper[4983]: I0316 00:36:07.473251 4983 scope.go:117] "RemoveContainer" containerID="1accee3cbf491221453bc50f3bfc4fc15297e3a876200e21860d5dd4e3e66686" Mar 16 00:36:15 crc kubenswrapper[4983]: I0316 00:36:15.092251 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:15 crc kubenswrapper[4983]: E0316 00:36:15.093087 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:36:30 crc kubenswrapper[4983]: I0316 00:36:30.093980 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:30 crc kubenswrapper[4983]: E0316 00:36:30.094719 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:36:44 crc kubenswrapper[4983]: I0316 00:36:44.092808 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:44 crc kubenswrapper[4983]: E0316 00:36:44.093632 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:36:53 crc kubenswrapper[4983]: I0316 00:36:53.332599 4983 generic.go:334] "Generic (PLEG): container finished" podID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" exitCode=0 Mar 16 00:36:53 crc kubenswrapper[4983]: I0316 00:36:53.332700 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldkzc/must-gather-dnctp" event={"ID":"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736","Type":"ContainerDied","Data":"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b"} Mar 16 00:36:53 crc kubenswrapper[4983]: I0316 00:36:53.337905 4983 scope.go:117] "RemoveContainer" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" Mar 16 00:36:54 crc kubenswrapper[4983]: I0316 00:36:54.203560 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldkzc_must-gather-dnctp_dfbc5d99-14f4-4c00-bb4e-a20b27fbd736/gather/0.log" Mar 16 00:36:55 crc kubenswrapper[4983]: I0316 00:36:55.094229 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:55 crc kubenswrapper[4983]: E0316 00:36:55.094859 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:37:00 crc kubenswrapper[4983]: I0316 00:37:00.949551 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:37:00 crc kubenswrapper[4983]: I0316 00:37:00.950232 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ldkzc/must-gather-dnctp" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="copy" containerID="cri-o://be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" gracePeriod=2 Mar 16 00:37:00 crc kubenswrapper[4983]: I0316 00:37:00.955510 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.295902 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldkzc_must-gather-dnctp_dfbc5d99-14f4-4c00-bb4e-a20b27fbd736/copy/0.log" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.296948 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.314300 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") pod \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.314435 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") pod \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.322418 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc" (OuterVolumeSpecName: "kube-api-access-p2qxc") pod "dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" (UID: "dfbc5d99-14f4-4c00-bb4e-a20b27fbd736"). InnerVolumeSpecName "kube-api-access-p2qxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.394132 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" (UID: "dfbc5d99-14f4-4c00-bb4e-a20b27fbd736"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.398656 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldkzc_must-gather-dnctp_dfbc5d99-14f4-4c00-bb4e-a20b27fbd736/copy/0.log" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.400284 4983 generic.go:334] "Generic (PLEG): container finished" podID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerID="be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" exitCode=143 Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.400347 4983 scope.go:117] "RemoveContainer" containerID="be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.406912 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.417456 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.417491 4983 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.433626 4983 scope.go:117] "RemoveContainer" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.490514 4983 scope.go:117] "RemoveContainer" containerID="be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" Mar 16 00:37:01 crc kubenswrapper[4983]: E0316 00:37:01.490998 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2\": container with ID starting with be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2 not found: ID does not exist" containerID="be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.491098 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2"} err="failed to get container status \"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2\": rpc error: code = NotFound desc = could not find container \"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2\": container with ID starting with be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2 not found: ID does not exist" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.491192 4983 scope.go:117] "RemoveContainer" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" Mar 16 00:37:01 crc kubenswrapper[4983]: E0316 00:37:01.491669 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b\": container with ID starting with 186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b not found: ID does not exist" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.491714 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b"} err="failed to get container status \"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b\": rpc error: code = NotFound desc = could not find container \"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b\": container with ID starting with 186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b not found: ID does not exist" Mar 16 00:37:02 crc kubenswrapper[4983]: I0316 00:37:02.100659 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" path="/var/lib/kubelet/pods/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736/volumes" Mar 16 00:37:10 
crc kubenswrapper[4983]: I0316 00:37:10.093397 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:37:10 crc kubenswrapper[4983]: E0316 00:37:10.094602 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:37:24 crc kubenswrapper[4983]: I0316 00:37:24.092688 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:37:24 crc kubenswrapper[4983]: E0316 00:37:24.093438 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:37:37 crc kubenswrapper[4983]: I0316 00:37:37.092718 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:37:37 crc kubenswrapper[4983]: E0316 00:37:37.093610 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:37:50 crc kubenswrapper[4983]: I0316 00:37:50.093221 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:37:50 crc kubenswrapper[4983]: E0316 00:37:50.094052 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.156408 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560358-ns9g8"] Mar 16 00:38:00 crc kubenswrapper[4983]: E0316 00:38:00.157422 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="gather" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157444 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="gather" Mar 16 00:38:00 crc kubenswrapper[4983]: E0316 00:38:00.157464 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68b39b9-3d4f-4927-86de-6ee746b6165c" containerName="oc" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157477 4983 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e68b39b9-3d4f-4927-86de-6ee746b6165c" containerName="oc" Mar 16 00:38:00 crc kubenswrapper[4983]: E0316 00:38:00.157507 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="copy" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157519 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="copy" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157722 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="e68b39b9-3d4f-4927-86de-6ee746b6165c" containerName="oc" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157748 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="gather" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157795 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="copy" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.158475 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.160721 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.162924 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.171607 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.177067 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560358-ns9g8"] Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.296039 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") pod \"auto-csr-approver-29560358-ns9g8\" (UID: \"b5014785-5ff5-4e7a-9347-12767e48ce8b\") " pod="openshift-infra/auto-csr-approver-29560358-ns9g8" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.398164 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") pod \"auto-csr-approver-29560358-ns9g8\" (UID: \"b5014785-5ff5-4e7a-9347-12767e48ce8b\") " pod="openshift-infra/auto-csr-approver-29560358-ns9g8" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.421430 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") pod \"auto-csr-approver-29560358-ns9g8\" (UID: \"b5014785-5ff5-4e7a-9347-12767e48ce8b\") " pod="openshift-infra/auto-csr-approver-29560358-ns9g8" Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.497552 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" Mar 16 00:38:01 crc kubenswrapper[4983]: I0316 00:38:01.010699 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560358-ns9g8"] Mar 16 00:38:01 crc kubenswrapper[4983]: W0316 00:38:01.013313 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5014785_5ff5_4e7a_9347_12767e48ce8b.slice/crio-52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031 WatchSource:0}: Error finding container 52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031: Status 404 returned error can't find the container with id 52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031 Mar 16 00:38:01 crc kubenswrapper[4983]: I0316 00:38:01.898357 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" event={"ID":"b5014785-5ff5-4e7a-9347-12767e48ce8b","Type":"ContainerStarted","Data":"52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031"} Mar 16 00:38:02 crc kubenswrapper[4983]: I0316 00:38:02.912178 4983 generic.go:334] "Generic (PLEG): container finished" podID="b5014785-5ff5-4e7a-9347-12767e48ce8b" containerID="20046d3f82f54e83f8b5bad0847dd44a1c63394206a4e16905a8f088a3bee614" exitCode=0 Mar 16 00:38:02 crc kubenswrapper[4983]: I0316 00:38:02.912305 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" event={"ID":"b5014785-5ff5-4e7a-9347-12767e48ce8b","Type":"ContainerDied","Data":"20046d3f82f54e83f8b5bad0847dd44a1c63394206a4e16905a8f088a3bee614"} Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.095178 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.233830 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.360173 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") pod \"b5014785-5ff5-4e7a-9347-12767e48ce8b\" (UID: \"b5014785-5ff5-4e7a-9347-12767e48ce8b\") " Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.367652 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm" (OuterVolumeSpecName: "kube-api-access-26knm") pod "b5014785-5ff5-4e7a-9347-12767e48ce8b" (UID: "b5014785-5ff5-4e7a-9347-12767e48ce8b"). InnerVolumeSpecName "kube-api-access-26knm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.461500 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") on node \"crc\" DevicePath \"\"" Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.930621 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"a6c7ce0a8fa5ee2f9b1b96037cb7d6454f8ffd8ed071ec717005e5711d4eceb0"} Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.932859 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" event={"ID":"b5014785-5ff5-4e7a-9347-12767e48ce8b","Type":"ContainerDied","Data":"52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031"} Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.932913 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031" Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.932953 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" Mar 16 00:38:05 crc kubenswrapper[4983]: I0316 00:38:05.297290 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"] Mar 16 00:38:05 crc kubenswrapper[4983]: I0316 00:38:05.304215 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"] Mar 16 00:38:06 crc kubenswrapper[4983]: I0316 00:38:06.113186 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" path="/var/lib/kubelet/pods/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf/volumes" Mar 16 00:38:07 crc kubenswrapper[4983]: I0316 00:38:07.583298 4983 scope.go:117] "RemoveContainer" containerID="e9ad9d465cd26c47636106767bbd93622a4df6a39436eee8b4a72f1c036d34fd" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.172256 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560360-b78zr"] Mar 16 00:40:00 crc kubenswrapper[4983]: E0316 00:40:00.175509 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5014785-5ff5-4e7a-9347-12767e48ce8b" containerName="oc" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.175562 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5014785-5ff5-4e7a-9347-12767e48ce8b" containerName="oc" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.176024 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5014785-5ff5-4e7a-9347-12767e48ce8b" containerName="oc" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.176938 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-b78zr" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.180797 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.180933 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.181167 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560360-b78zr"] Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.186659 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.241692 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s8l5\" (UniqueName: \"kubernetes.io/projected/46434756-a999-477f-9c06-3727a0752b16-kube-api-access-4s8l5\") pod \"auto-csr-approver-29560360-b78zr\" (UID: \"46434756-a999-477f-9c06-3727a0752b16\") " pod="openshift-infra/auto-csr-approver-29560360-b78zr" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.343582 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s8l5\" (UniqueName: \"kubernetes.io/projected/46434756-a999-477f-9c06-3727a0752b16-kube-api-access-4s8l5\") pod \"auto-csr-approver-29560360-b78zr\" (UID: \"46434756-a999-477f-9c06-3727a0752b16\") " pod="openshift-infra/auto-csr-approver-29560360-b78zr" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.374710 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s8l5\" (UniqueName: \"kubernetes.io/projected/46434756-a999-477f-9c06-3727a0752b16-kube-api-access-4s8l5\") pod \"auto-csr-approver-29560360-b78zr\" (UID: \"46434756-a999-477f-9c06-3727a0752b16\") " pod="openshift-infra/auto-csr-approver-29560360-b78zr" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.509323 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-b78zr" Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.960019 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560360-b78zr"] Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.970583 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:40:01 crc kubenswrapper[4983]: I0316 00:40:01.967846 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-b78zr" event={"ID":"46434756-a999-477f-9c06-3727a0752b16","Type":"ContainerStarted","Data":"3aa6fb62136fc0f873740625504cd1fdc9036f62386a42240131bfbed472f880"} Mar 16 00:40:02 crc kubenswrapper[4983]: I0316 00:40:02.974526 4983 generic.go:334] "Generic (PLEG): container finished" podID="46434756-a999-477f-9c06-3727a0752b16" containerID="31cbe57ea280fc680a823a7db247b537ecaa54df845636c69f7270c931a9a663" exitCode=0 Mar 16 00:40:02 crc kubenswrapper[4983]: I0316 00:40:02.974592 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-b78zr" event={"ID":"46434756-a999-477f-9c06-3727a0752b16","Type":"ContainerDied","Data":"31cbe57ea280fc680a823a7db247b537ecaa54df845636c69f7270c931a9a663"} Mar 16 00:40:04 crc kubenswrapper[4983]: I0316 00:40:04.325729 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-b78zr"